samitizerxu commited on
Commit
3b472ed
1 Parent(s): 780d808

Added model files

Browse files
README.md ADDED
@@ -0,0 +1,133 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ ---
2
+ language:
3
+ - zh-CN
4
+ license: apache-2.0
5
+ tags:
6
+ - automatic-speech-recognition
7
+ - common_voice
8
+ - generated_from_trainer
9
+ datasets:
10
+ - common_voice
11
+ model-index:
12
+ - name: wav2vec2-xls-r-300m-zh-CN
13
+ results: []
14
+ ---
15
+
16
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
17
+ should probably proofread and complete it, then remove this comment. -->
18
+
19
+ # wav2vec2-xls-r-300m-zh-CN
20
+
21
+ This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the COMMON_VOICE - ZH-CN dataset.
22
+ It achieves the following results on the evaluation set:
23
+ - Loss: 0.8828
24
+ - Wer: 2.0604
25
+
26
+ ## Model description
27
+
28
+ More information needed
29
+
30
+ ## Intended uses & limitations
31
+
32
+ More information needed
33
+
34
+ ## Training and evaluation data
35
+
36
+ More information needed
37
+
38
+ ## Training procedure
39
+
40
+ ### Training hyperparameters
41
+
42
+ The following hyperparameters were used during training:
43
+ - learning_rate: 7.5e-05
44
+ - train_batch_size: 8
45
+ - eval_batch_size: 8
46
+ - seed: 42
47
+ - gradient_accumulation_steps: 4
48
+ - total_train_batch_size: 32
49
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
50
+ - lr_scheduler_type: linear
51
+ - lr_scheduler_warmup_steps: 2000
52
+ - num_epochs: 50.0
53
+ - mixed_precision_training: Native AMP
54
+
55
+ ### Training results
56
+
57
+ | Training Loss | Epoch | Step | Validation Loss | Wer |
58
+ |:-------------:|:-----:|:-----:|:---------------:|:------:|
59
+ | 60.2112 | 0.74 | 500 | 64.8189 | 1.0 |
60
+ | 8.1128 | 1.48 | 1000 | 6.8997 | 1.0 |
61
+ | 6.0492 | 2.22 | 1500 | 5.9677 | 1.9495 |
62
+ | 5.9326 | 2.95 | 2000 | 5.8845 | 1.4092 |
63
+ | 5.8763 | 3.69 | 2500 | 5.8460 | 1.6126 |
64
+ | 5.7888 | 4.43 | 3000 | 5.7545 | 2.2034 |
65
+ | 5.735 | 5.17 | 3500 | 5.6777 | 2.3350 |
66
+ | 5.6861 | 5.91 | 4000 | 5.5179 | 2.2232 |
67
+ | 5.381 | 6.65 | 4500 | 5.1420 | 2.1816 |
68
+ | 4.625 | 7.39 | 5000 | 3.9020 | 2.0722 |
69
+ | 4.214 | 8.12 | 5500 | 3.3394 | 2.1430 |
70
+ | 3.8992 | 8.86 | 6000 | 2.9085 | 2.1534 |
71
+ | 3.6481 | 9.6 | 6500 | 2.6208 | 2.3538 |
72
+ | 3.4658 | 10.34 | 7000 | 2.3172 | 2.2271 |
73
+ | 3.257 | 11.08 | 7500 | 2.0916 | 2.1351 |
74
+ | 3.1294 | 11.82 | 8000 | 1.8954 | 2.2133 |
75
+ | 3.0266 | 12.56 | 8500 | 1.7673 | 2.0896 |
76
+ | 2.9451 | 13.29 | 9000 | 1.6659 | 2.1381 |
77
+ | 2.8802 | 14.03 | 9500 | 1.5637 | 2.1969 |
78
+ | 2.78 | 14.77 | 10000 | 1.4921 | 2.2335 |
79
+ | 2.7049 | 15.51 | 10500 | 1.4132 | 2.2217 |
80
+ | 2.6768 | 16.25 | 11000 | 1.3667 | 2.2232 |
81
+ | 2.6358 | 16.99 | 11500 | 1.3111 | 2.1286 |
82
+ | 2.5802 | 17.72 | 12000 | 1.2679 | 2.1430 |
83
+ | 2.5012 | 18.46 | 12500 | 1.2365 | 2.1153 |
84
+ | 2.458 | 19.2 | 13000 | 1.2118 | 2.1573 |
85
+ | 2.4433 | 19.94 | 13500 | 1.1992 | 2.1336 |
86
+ | 2.438 | 20.68 | 14000 | 1.1803 | 2.1509 |
87
+ | 2.418 | 21.42 | 14500 | 1.1601 | 2.1232 |
88
+ | 2.3322 | 22.16 | 15000 | 1.1418 | 2.1930 |
89
+ | 2.3387 | 22.89 | 15500 | 1.1172 | 2.2464 |
90
+ | 2.3349 | 23.63 | 16000 | 1.1144 | 2.1856 |
91
+ | 2.291 | 24.37 | 16500 | 1.1018 | 2.1930 |
92
+ | 2.2766 | 25.11 | 17000 | 1.0883 | 2.1762 |
93
+ | 2.2534 | 25.85 | 17500 | 1.0744 | 2.1875 |
94
+ | 2.2393 | 26.59 | 18000 | 1.0561 | 2.1846 |
95
+ | 2.2085 | 27.33 | 18500 | 1.0466 | 2.1445 |
96
+ | 2.1966 | 28.06 | 19000 | 1.0382 | 2.1089 |
97
+ | 2.1794 | 28.8 | 19500 | 1.0264 | 1.9861 |
98
+ | 2.1423 | 29.54 | 20000 | 1.0246 | 1.9678 |
99
+ | 2.1649 | 30.28 | 20500 | 0.9982 | 2.0005 |
100
+ | 2.143 | 31.02 | 21000 | 0.9985 | 2.0450 |
101
+ | 2.1338 | 31.76 | 21500 | 0.9932 | 2.0025 |
102
+ | 2.1076 | 32.5 | 22000 | 0.9903 | 2.0505 |
103
+ | 2.0519 | 33.23 | 22500 | 0.9834 | 2.0737 |
104
+ | 2.0534 | 33.97 | 23000 | 0.9756 | 2.0247 |
105
+ | 2.0121 | 34.71 | 23500 | 0.9688 | 2.1440 |
106
+ | 2.0161 | 35.45 | 24000 | 0.9582 | 2.1232 |
107
+ | 2.0178 | 36.19 | 24500 | 0.9480 | 2.0896 |
108
+ | 2.0154 | 36.93 | 25000 | 0.9483 | 2.0787 |
109
+ | 1.9966 | 37.67 | 25500 | 0.9406 | 2.0297 |
110
+ | 1.9753 | 38.4 | 26000 | 0.9419 | 2.0346 |
111
+ | 1.9524 | 39.14 | 26500 | 0.9274 | 2.0698 |
112
+ | 1.9427 | 39.88 | 27000 | 0.9233 | 2.0787 |
113
+ | 1.9258 | 40.62 | 27500 | 0.9182 | 2.0529 |
114
+ | 1.9031 | 41.36 | 28000 | 0.9150 | 2.0787 |
115
+ | 1.9297 | 42.1 | 28500 | 0.9040 | 2.0505 |
116
+ | 1.9041 | 42.84 | 29000 | 0.9009 | 2.0579 |
117
+ | 1.8929 | 43.57 | 29500 | 0.8968 | 2.0327 |
118
+ | 1.9077 | 44.31 | 30000 | 0.8954 | 2.0619 |
119
+ | 1.8504 | 45.05 | 30500 | 0.8922 | 2.0737 |
120
+ | 1.8732 | 45.79 | 31000 | 0.8898 | 2.0683 |
121
+ | 1.877 | 46.53 | 31500 | 0.8849 | 2.0589 |
122
+ | 1.8587 | 47.27 | 32000 | 0.8843 | 2.0450 |
123
+ | 1.8236 | 48.01 | 32500 | 0.8810 | 2.0554 |
124
+ | 1.8392 | 48.74 | 33000 | 0.8820 | 2.0574 |
125
+ | 1.8428 | 49.48 | 33500 | 0.8816 | 2.0668 |
126
+
127
+
128
+ ### Framework versions
129
+
130
+ - Transformers 4.17.0.dev0
131
+ - Pytorch 1.10.2+cu102
132
+ - Datasets 1.18.2.dev0
133
+ - Tokenizers 0.11.0
added_tokens.json ADDED
@@ -0,0 +1 @@
 
1
+ {"<s>": 4650, "</s>": 4651}
all_results.json ADDED
@@ -0,0 +1,14 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "epoch": 50.0,
3
+ "eval_loss": 0.8828312158584595,
4
+ "eval_runtime": 118.6394,
5
+ "eval_samples": 2021,
6
+ "eval_samples_per_second": 17.035,
7
+ "eval_steps_per_second": 2.133,
8
+ "eval_wer": 2.060366155368629,
9
+ "train_loss": 4.34445102640938,
10
+ "train_runtime": 69888.62,
11
+ "train_samples": 21672,
12
+ "train_samples_per_second": 15.505,
13
+ "train_steps_per_second": 0.484
14
+ }
checkpoint-32500/config.json ADDED
@@ -0,0 +1,107 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "_name_or_path": "facebook/wav2vec2-xls-r-300m",
3
+ "activation_dropout": 0.1,
4
+ "adapter_kernel_size": 3,
5
+ "adapter_stride": 2,
6
+ "add_adapter": false,
7
+ "apply_spec_augment": true,
8
+ "architectures": [
9
+ "Wav2Vec2ForCTC"
10
+ ],
11
+ "attention_dropout": 0.0,
12
+ "bos_token_id": 1,
13
+ "classifier_proj_size": 256,
14
+ "codevector_dim": 768,
15
+ "contrastive_logits_temperature": 0.1,
16
+ "conv_bias": true,
17
+ "conv_dim": [
18
+ 512,
19
+ 512,
20
+ 512,
21
+ 512,
22
+ 512,
23
+ 512,
24
+ 512
25
+ ],
26
+ "conv_kernel": [
27
+ 10,
28
+ 3,
29
+ 3,
30
+ 3,
31
+ 3,
32
+ 2,
33
+ 2
34
+ ],
35
+ "conv_stride": [
36
+ 5,
37
+ 2,
38
+ 2,
39
+ 2,
40
+ 2,
41
+ 2,
42
+ 2
43
+ ],
44
+ "ctc_loss_reduction": "mean",
45
+ "ctc_zero_infinity": false,
46
+ "diversity_loss_weight": 0.1,
47
+ "do_stable_layer_norm": true,
48
+ "eos_token_id": 2,
49
+ "feat_extract_activation": "gelu",
50
+ "feat_extract_dropout": 0.0,
51
+ "feat_extract_norm": "layer",
52
+ "feat_proj_dropout": 0.0,
53
+ "feat_quantizer_dropout": 0.0,
54
+ "final_dropout": 0.0,
55
+ "hidden_act": "gelu",
56
+ "hidden_dropout": 0.0,
57
+ "hidden_size": 1024,
58
+ "initializer_range": 0.02,
59
+ "intermediate_size": 4096,
60
+ "layer_norm_eps": 1e-05,
61
+ "layerdrop": 0.0,
62
+ "mask_feature_length": 64,
63
+ "mask_feature_min_masks": 0,
64
+ "mask_feature_prob": 0.25,
65
+ "mask_time_length": 10,
66
+ "mask_time_min_masks": 2,
67
+ "mask_time_prob": 0.75,
68
+ "model_type": "wav2vec2",
69
+ "num_adapter_layers": 3,
70
+ "num_attention_heads": 16,
71
+ "num_codevector_groups": 2,
72
+ "num_codevectors_per_group": 320,
73
+ "num_conv_pos_embedding_groups": 16,
74
+ "num_conv_pos_embeddings": 128,
75
+ "num_feat_extract_layers": 7,
76
+ "num_hidden_layers": 24,
77
+ "num_negatives": 100,
78
+ "output_hidden_size": 1024,
79
+ "pad_token_id": 4649,
80
+ "proj_codevector_dim": 768,
81
+ "tdnn_dilation": [
82
+ 1,
83
+ 2,
84
+ 3,
85
+ 1,
86
+ 1
87
+ ],
88
+ "tdnn_dim": [
89
+ 512,
90
+ 512,
91
+ 512,
92
+ 512,
93
+ 1500
94
+ ],
95
+ "tdnn_kernel": [
96
+ 5,
97
+ 3,
98
+ 3,
99
+ 1,
100
+ 1
101
+ ],
102
+ "torch_dtype": "float32",
103
+ "transformers_version": "4.17.0.dev0",
104
+ "use_weighted_layer_sum": false,
105
+ "vocab_size": 4652,
106
+ "xvector_output_dim": 512
107
+ }
checkpoint-32500/optimizer.pt ADDED
@@ -0,0 +1,3 @@
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:12fb441e6452947ea1655c7f7c842aff2a299dc45d448b7a216d33c0c2817f99
3
+ size 2528205329
checkpoint-32500/preprocessor_config.json ADDED
@@ -0,0 +1,9 @@
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "do_normalize": true,
3
+ "feature_extractor_type": "Wav2Vec2FeatureExtractor",
4
+ "feature_size": 1,
5
+ "padding_side": "right",
6
+ "padding_value": 0,
7
+ "return_attention_mask": true,
8
+ "sampling_rate": 16000
9
+ }
checkpoint-32500/pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:3cd6823aabe38955631845ceaec248c20d01321b5bd70f2c665a597692f680d3
3
+ size 1280996913
checkpoint-32500/rng_state.pth ADDED
@@ -0,0 +1,3 @@
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:b7b2eb2a51b578035419f76ce97fd56f254fbffb12d8ddcefcb41ac1401484b0
3
+ size 14567
checkpoint-32500/scaler.pt ADDED
@@ -0,0 +1,3 @@
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:abc763f34c98d95acfb42ffc40f6d4aa2cec7a1927cfa8e1c991f18aaa5785bb
3
+ size 559
checkpoint-32500/scheduler.pt ADDED
@@ -0,0 +1,3 @@
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:b19fdca88fad3c20f4a17cedf3f05b06deb1c5d5c0d48bc89205f2835a20fa7b
3
+ size 623
checkpoint-32500/trainer_state.json ADDED
@@ -0,0 +1,2551 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "best_metric": null,
3
+ "best_model_checkpoint": null,
4
+ "epoch": 48.00590623846438,
5
+ "global_step": 32500,
6
+ "is_hyper_param_search": false,
7
+ "is_local_process_zero": true,
8
+ "is_world_process_zero": true,
9
+ "log_history": [
10
+ {
11
+ "epoch": 0.15,
12
+ "learning_rate": 3.6375e-06,
13
+ "loss": 124.9665,
14
+ "step": 100
15
+ },
16
+ {
17
+ "epoch": 0.3,
18
+ "learning_rate": 7.3875e-06,
19
+ "loss": 92.673,
20
+ "step": 200
21
+ },
22
+ {
23
+ "epoch": 0.44,
24
+ "learning_rate": 1.1099999999999999e-05,
25
+ "loss": 74.8932,
26
+ "step": 300
27
+ },
28
+ {
29
+ "epoch": 0.59,
30
+ "learning_rate": 1.485e-05,
31
+ "loss": 68.0432,
32
+ "step": 400
33
+ },
34
+ {
35
+ "epoch": 0.74,
36
+ "learning_rate": 1.8599999999999998e-05,
37
+ "loss": 60.2112,
38
+ "step": 500
39
+ },
40
+ {
41
+ "epoch": 0.74,
42
+ "eval_loss": 64.81886291503906,
43
+ "eval_runtime": 129.9516,
44
+ "eval_samples_per_second": 15.552,
45
+ "eval_steps_per_second": 1.947,
46
+ "eval_wer": 1.0,
47
+ "step": 500
48
+ },
49
+ {
50
+ "epoch": 0.89,
51
+ "learning_rate": 2.2349999999999998e-05,
52
+ "loss": 51.3096,
53
+ "step": 600
54
+ },
55
+ {
56
+ "epoch": 1.03,
57
+ "learning_rate": 2.6099999999999997e-05,
58
+ "loss": 39.1106,
59
+ "step": 700
60
+ },
61
+ {
62
+ "epoch": 1.18,
63
+ "learning_rate": 2.985e-05,
64
+ "loss": 26.6843,
65
+ "step": 800
66
+ },
67
+ {
68
+ "epoch": 1.33,
69
+ "learning_rate": 3.36e-05,
70
+ "loss": 14.7864,
71
+ "step": 900
72
+ },
73
+ {
74
+ "epoch": 1.48,
75
+ "learning_rate": 3.735e-05,
76
+ "loss": 8.1128,
77
+ "step": 1000
78
+ },
79
+ {
80
+ "epoch": 1.48,
81
+ "eval_loss": 6.899676322937012,
82
+ "eval_runtime": 115.5788,
83
+ "eval_samples_per_second": 17.486,
84
+ "eval_steps_per_second": 2.189,
85
+ "eval_wer": 1.0,
86
+ "step": 1000
87
+ },
88
+ {
89
+ "epoch": 1.62,
90
+ "learning_rate": 4.11e-05,
91
+ "loss": 6.6068,
92
+ "step": 1100
93
+ },
94
+ {
95
+ "epoch": 1.77,
96
+ "learning_rate": 4.484999999999999e-05,
97
+ "loss": 6.23,
98
+ "step": 1200
99
+ },
100
+ {
101
+ "epoch": 1.92,
102
+ "learning_rate": 4.8599999999999995e-05,
103
+ "loss": 6.0972,
104
+ "step": 1300
105
+ },
106
+ {
107
+ "epoch": 2.07,
108
+ "learning_rate": 5.234999999999999e-05,
109
+ "loss": 6.0595,
110
+ "step": 1400
111
+ },
112
+ {
113
+ "epoch": 2.22,
114
+ "learning_rate": 5.6099999999999995e-05,
115
+ "loss": 6.0492,
116
+ "step": 1500
117
+ },
118
+ {
119
+ "epoch": 2.22,
120
+ "eval_loss": 5.967654228210449,
121
+ "eval_runtime": 115.432,
122
+ "eval_samples_per_second": 17.508,
123
+ "eval_steps_per_second": 2.192,
124
+ "eval_wer": 1.949529935675408,
125
+ "step": 1500
126
+ },
127
+ {
128
+ "epoch": 2.36,
129
+ "learning_rate": 5.985e-05,
130
+ "loss": 6.0266,
131
+ "step": 1600
132
+ },
133
+ {
134
+ "epoch": 2.51,
135
+ "learning_rate": 6.359999999999999e-05,
136
+ "loss": 5.9902,
137
+ "step": 1700
138
+ },
139
+ {
140
+ "epoch": 2.66,
141
+ "learning_rate": 6.735e-05,
142
+ "loss": 5.9762,
143
+ "step": 1800
144
+ },
145
+ {
146
+ "epoch": 2.81,
147
+ "learning_rate": 7.11e-05,
148
+ "loss": 5.9491,
149
+ "step": 1900
150
+ },
151
+ {
152
+ "epoch": 2.95,
153
+ "learning_rate": 7.484999999999999e-05,
154
+ "loss": 5.9326,
155
+ "step": 2000
156
+ },
157
+ {
158
+ "epoch": 2.95,
159
+ "eval_loss": 5.884542942047119,
160
+ "eval_runtime": 114.597,
161
+ "eval_samples_per_second": 17.636,
162
+ "eval_steps_per_second": 2.208,
163
+ "eval_wer": 1.409203364670955,
164
+ "step": 2000
165
+ },
166
+ {
167
+ "epoch": 3.1,
168
+ "learning_rate": 7.477394034536891e-05,
169
+ "loss": 5.9356,
170
+ "step": 2100
171
+ },
172
+ {
173
+ "epoch": 3.25,
174
+ "learning_rate": 7.453846153846153e-05,
175
+ "loss": 5.8889,
176
+ "step": 2200
177
+ },
178
+ {
179
+ "epoch": 3.4,
180
+ "learning_rate": 7.430298273155415e-05,
181
+ "loss": 5.899,
182
+ "step": 2300
183
+ },
184
+ {
185
+ "epoch": 3.54,
186
+ "learning_rate": 7.406750392464678e-05,
187
+ "loss": 5.8824,
188
+ "step": 2400
189
+ },
190
+ {
191
+ "epoch": 3.69,
192
+ "learning_rate": 7.38320251177394e-05,
193
+ "loss": 5.8763,
194
+ "step": 2500
195
+ },
196
+ {
197
+ "epoch": 3.69,
198
+ "eval_loss": 5.846009731292725,
199
+ "eval_runtime": 117.5393,
200
+ "eval_samples_per_second": 17.194,
201
+ "eval_steps_per_second": 2.152,
202
+ "eval_wer": 1.6125680356259278,
203
+ "step": 2500
204
+ },
205
+ {
206
+ "epoch": 3.84,
207
+ "learning_rate": 7.359654631083201e-05,
208
+ "loss": 5.875,
209
+ "step": 2600
210
+ },
211
+ {
212
+ "epoch": 3.99,
213
+ "learning_rate": 7.336106750392464e-05,
214
+ "loss": 5.8671,
215
+ "step": 2700
216
+ },
217
+ {
218
+ "epoch": 4.14,
219
+ "learning_rate": 7.312558869701726e-05,
220
+ "loss": 5.8591,
221
+ "step": 2800
222
+ },
223
+ {
224
+ "epoch": 4.28,
225
+ "learning_rate": 7.289010989010989e-05,
226
+ "loss": 5.8226,
227
+ "step": 2900
228
+ },
229
+ {
230
+ "epoch": 4.43,
231
+ "learning_rate": 7.265463108320251e-05,
232
+ "loss": 5.7888,
233
+ "step": 3000
234
+ },
235
+ {
236
+ "epoch": 4.43,
237
+ "eval_loss": 5.75445032119751,
238
+ "eval_runtime": 114.1832,
239
+ "eval_samples_per_second": 17.7,
240
+ "eval_steps_per_second": 2.216,
241
+ "eval_wer": 2.2033646709549726,
242
+ "step": 3000
243
+ },
244
+ {
245
+ "epoch": 4.58,
246
+ "learning_rate": 7.241915227629513e-05,
247
+ "loss": 5.8041,
248
+ "step": 3100
249
+ },
250
+ {
251
+ "epoch": 4.73,
252
+ "learning_rate": 7.218367346938774e-05,
253
+ "loss": 5.8013,
254
+ "step": 3200
255
+ },
256
+ {
257
+ "epoch": 4.87,
258
+ "learning_rate": 7.194819466248037e-05,
259
+ "loss": 5.7947,
260
+ "step": 3300
261
+ },
262
+ {
263
+ "epoch": 5.02,
264
+ "learning_rate": 7.171271585557299e-05,
265
+ "loss": 5.7802,
266
+ "step": 3400
267
+ },
268
+ {
269
+ "epoch": 5.17,
270
+ "learning_rate": 7.147723704866562e-05,
271
+ "loss": 5.735,
272
+ "step": 3500
273
+ },
274
+ {
275
+ "epoch": 5.17,
276
+ "eval_loss": 5.677657604217529,
277
+ "eval_runtime": 115.6516,
278
+ "eval_samples_per_second": 17.475,
279
+ "eval_steps_per_second": 2.188,
280
+ "eval_wer": 2.334982681840673,
281
+ "step": 3500
282
+ },
283
+ {
284
+ "epoch": 5.32,
285
+ "learning_rate": 7.124175824175823e-05,
286
+ "loss": 5.7198,
287
+ "step": 3600
288
+ },
289
+ {
290
+ "epoch": 5.47,
291
+ "learning_rate": 7.100627943485086e-05,
292
+ "loss": 5.7092,
293
+ "step": 3700
294
+ },
295
+ {
296
+ "epoch": 5.61,
297
+ "learning_rate": 7.077080062794347e-05,
298
+ "loss": 5.6613,
299
+ "step": 3800
300
+ },
301
+ {
302
+ "epoch": 5.76,
303
+ "learning_rate": 7.05353218210361e-05,
304
+ "loss": 5.6579,
305
+ "step": 3900
306
+ },
307
+ {
308
+ "epoch": 5.91,
309
+ "learning_rate": 7.029984301412873e-05,
310
+ "loss": 5.6861,
311
+ "step": 4000
312
+ },
313
+ {
314
+ "epoch": 5.91,
315
+ "eval_loss": 5.517865180969238,
316
+ "eval_runtime": 115.3653,
317
+ "eval_samples_per_second": 17.518,
318
+ "eval_steps_per_second": 2.193,
319
+ "eval_wer": 2.223156853043048,
320
+ "step": 4000
321
+ },
322
+ {
323
+ "epoch": 6.06,
324
+ "learning_rate": 7.006436420722135e-05,
325
+ "loss": 5.6024,
326
+ "step": 4100
327
+ },
328
+ {
329
+ "epoch": 6.2,
330
+ "learning_rate": 6.982888540031396e-05,
331
+ "loss": 5.5497,
332
+ "step": 4200
333
+ },
334
+ {
335
+ "epoch": 6.35,
336
+ "learning_rate": 6.959340659340659e-05,
337
+ "loss": 5.5257,
338
+ "step": 4300
339
+ },
340
+ {
341
+ "epoch": 6.5,
342
+ "learning_rate": 6.93579277864992e-05,
343
+ "loss": 5.4534,
344
+ "step": 4400
345
+ },
346
+ {
347
+ "epoch": 6.65,
348
+ "learning_rate": 6.912244897959182e-05,
349
+ "loss": 5.381,
350
+ "step": 4500
351
+ },
352
+ {
353
+ "epoch": 6.65,
354
+ "eval_loss": 5.142032146453857,
355
+ "eval_runtime": 117.6237,
356
+ "eval_samples_per_second": 17.182,
357
+ "eval_steps_per_second": 2.151,
358
+ "eval_wer": 2.18159327065809,
359
+ "step": 4500
360
+ },
361
+ {
362
+ "epoch": 6.79,
363
+ "learning_rate": 6.888697017268445e-05,
364
+ "loss": 5.3409,
365
+ "step": 4600
366
+ },
367
+ {
368
+ "epoch": 6.94,
369
+ "learning_rate": 6.865149136577708e-05,
370
+ "loss": 5.1283,
371
+ "step": 4700
372
+ },
373
+ {
374
+ "epoch": 7.09,
375
+ "learning_rate": 6.841601255886969e-05,
376
+ "loss": 4.8788,
377
+ "step": 4800
378
+ },
379
+ {
380
+ "epoch": 7.24,
381
+ "learning_rate": 6.818053375196232e-05,
382
+ "loss": 4.7235,
383
+ "step": 4900
384
+ },
385
+ {
386
+ "epoch": 7.39,
387
+ "learning_rate": 6.794505494505494e-05,
388
+ "loss": 4.625,
389
+ "step": 5000
390
+ },
391
+ {
392
+ "epoch": 7.39,
393
+ "eval_loss": 3.9019837379455566,
394
+ "eval_runtime": 116.0971,
395
+ "eval_samples_per_second": 17.408,
396
+ "eval_steps_per_second": 2.179,
397
+ "eval_wer": 2.0722414646214746,
398
+ "step": 5000
399
+ },
400
+ {
401
+ "epoch": 7.53,
402
+ "learning_rate": 6.770957613814756e-05,
403
+ "loss": 4.5404,
404
+ "step": 5100
405
+ },
406
+ {
407
+ "epoch": 7.68,
408
+ "learning_rate": 6.747409733124018e-05,
409
+ "loss": 4.4307,
410
+ "step": 5200
411
+ },
412
+ {
413
+ "epoch": 7.83,
414
+ "learning_rate": 6.723861852433281e-05,
415
+ "loss": 4.3794,
416
+ "step": 5300
417
+ },
418
+ {
419
+ "epoch": 7.98,
420
+ "learning_rate": 6.700313971742542e-05,
421
+ "loss": 4.2786,
422
+ "step": 5400
423
+ },
424
+ {
425
+ "epoch": 8.12,
426
+ "learning_rate": 6.676766091051805e-05,
427
+ "loss": 4.214,
428
+ "step": 5500
429
+ },
430
+ {
431
+ "epoch": 8.12,
432
+ "eval_loss": 3.339416027069092,
433
+ "eval_runtime": 116.9868,
434
+ "eval_samples_per_second": 17.275,
435
+ "eval_steps_per_second": 2.163,
436
+ "eval_wer": 2.1429985155863434,
437
+ "step": 5500
438
+ },
439
+ {
440
+ "epoch": 8.27,
441
+ "learning_rate": 6.653218210361068e-05,
442
+ "loss": 4.1206,
443
+ "step": 5600
444
+ },
445
+ {
446
+ "epoch": 8.42,
447
+ "learning_rate": 6.62967032967033e-05,
448
+ "loss": 4.081,
449
+ "step": 5700
450
+ },
451
+ {
452
+ "epoch": 8.57,
453
+ "learning_rate": 6.606122448979591e-05,
454
+ "loss": 4.0059,
455
+ "step": 5800
456
+ },
457
+ {
458
+ "epoch": 8.71,
459
+ "learning_rate": 6.582574568288854e-05,
460
+ "loss": 3.9251,
461
+ "step": 5900
462
+ },
463
+ {
464
+ "epoch": 8.86,
465
+ "learning_rate": 6.559262166405023e-05,
466
+ "loss": 3.8992,
467
+ "step": 6000
468
+ },
469
+ {
470
+ "epoch": 8.86,
471
+ "eval_loss": 2.9084665775299072,
472
+ "eval_runtime": 119.0907,
473
+ "eval_samples_per_second": 16.97,
474
+ "eval_steps_per_second": 2.124,
475
+ "eval_wer": 2.153389411182583,
476
+ "step": 6000
477
+ },
478
+ {
479
+ "epoch": 9.01,
480
+ "learning_rate": 6.535714285714285e-05,
481
+ "loss": 3.8494,
482
+ "step": 6100
483
+ },
484
+ {
485
+ "epoch": 9.16,
486
+ "learning_rate": 6.512166405023547e-05,
487
+ "loss": 3.7923,
488
+ "step": 6200
489
+ },
490
+ {
491
+ "epoch": 9.31,
492
+ "learning_rate": 6.48861852433281e-05,
493
+ "loss": 3.7416,
494
+ "step": 6300
495
+ },
496
+ {
497
+ "epoch": 9.45,
498
+ "learning_rate": 6.465070643642071e-05,
499
+ "loss": 3.7095,
500
+ "step": 6400
501
+ },
502
+ {
503
+ "epoch": 9.6,
504
+ "learning_rate": 6.441522762951334e-05,
505
+ "loss": 3.6481,
506
+ "step": 6500
507
+ },
508
+ {
509
+ "epoch": 9.6,
510
+ "eval_loss": 2.620758295059204,
511
+ "eval_runtime": 115.2407,
512
+ "eval_samples_per_second": 17.537,
513
+ "eval_steps_per_second": 2.195,
514
+ "eval_wer": 2.3537852548243445,
515
+ "step": 6500
516
+ },
517
+ {
518
+ "epoch": 9.75,
519
+ "learning_rate": 6.417974882260596e-05,
520
+ "loss": 3.6196,
521
+ "step": 6600
522
+ },
523
+ {
524
+ "epoch": 9.9,
525
+ "learning_rate": 6.394427001569859e-05,
526
+ "loss": 3.5941,
527
+ "step": 6700
528
+ },
529
+ {
530
+ "epoch": 10.04,
531
+ "learning_rate": 6.37087912087912e-05,
532
+ "loss": 3.5608,
533
+ "step": 6800
534
+ },
535
+ {
536
+ "epoch": 10.19,
537
+ "learning_rate": 6.347331240188383e-05,
538
+ "loss": 3.5296,
539
+ "step": 6900
540
+ },
541
+ {
542
+ "epoch": 10.34,
543
+ "learning_rate": 6.324018838304552e-05,
544
+ "loss": 3.4658,
545
+ "step": 7000
546
+ },
547
+ {
548
+ "epoch": 10.34,
549
+ "eval_loss": 2.3172152042388916,
550
+ "eval_runtime": 114.5436,
551
+ "eval_samples_per_second": 17.644,
552
+ "eval_steps_per_second": 2.209,
553
+ "eval_wer": 2.227115289460663,
554
+ "step": 7000
555
+ },
556
+ {
557
+ "epoch": 10.49,
558
+ "learning_rate": 6.300470957613814e-05,
559
+ "loss": 3.3977,
560
+ "step": 7100
561
+ },
562
+ {
563
+ "epoch": 10.63,
564
+ "learning_rate": 6.276923076923076e-05,
565
+ "loss": 3.3987,
566
+ "step": 7200
567
+ },
568
+ {
569
+ "epoch": 10.78,
570
+ "learning_rate": 6.253375196232339e-05,
571
+ "loss": 3.3587,
572
+ "step": 7300
573
+ },
574
+ {
575
+ "epoch": 10.93,
576
+ "learning_rate": 6.2298273155416e-05,
577
+ "loss": 3.2796,
578
+ "step": 7400
579
+ },
580
+ {
581
+ "epoch": 11.08,
582
+ "learning_rate": 6.206279434850863e-05,
583
+ "loss": 3.257,
584
+ "step": 7500
585
+ },
586
+ {
587
+ "epoch": 11.08,
588
+ "eval_loss": 2.0916049480438232,
589
+ "eval_runtime": 113.9408,
590
+ "eval_samples_per_second": 17.737,
591
+ "eval_steps_per_second": 2.22,
592
+ "eval_wer": 2.1350816427511132,
593
+ "step": 7500
594
+ },
595
+ {
596
+ "epoch": 11.23,
597
+ "learning_rate": 6.182731554160125e-05,
598
+ "loss": 3.2476,
599
+ "step": 7600
600
+ },
601
+ {
602
+ "epoch": 11.37,
603
+ "learning_rate": 6.159183673469388e-05,
604
+ "loss": 3.2463,
605
+ "step": 7700
606
+ },
607
+ {
608
+ "epoch": 11.52,
609
+ "learning_rate": 6.135635792778649e-05,
610
+ "loss": 3.2323,
611
+ "step": 7800
612
+ },
613
+ {
614
+ "epoch": 11.67,
615
+ "learning_rate": 6.112087912087912e-05,
616
+ "loss": 3.1674,
617
+ "step": 7900
618
+ },
619
+ {
620
+ "epoch": 11.82,
621
+ "learning_rate": 6.088540031397174e-05,
622
+ "loss": 3.1294,
623
+ "step": 8000
624
+ },
625
+ {
626
+ "epoch": 11.82,
627
+ "eval_loss": 1.895378828048706,
628
+ "eval_runtime": 115.1394,
629
+ "eval_samples_per_second": 17.553,
630
+ "eval_steps_per_second": 2.197,
631
+ "eval_wer": 2.2132607619990106,
632
+ "step": 8000
633
+ },
634
+ {
635
+ "epoch": 11.96,
636
+ "learning_rate": 6.0649921507064355e-05,
637
+ "loss": 3.1262,
638
+ "step": 8100
639
+ },
640
+ {
641
+ "epoch": 12.11,
642
+ "learning_rate": 6.041444270015698e-05,
643
+ "loss": 3.0377,
644
+ "step": 8200
645
+ },
646
+ {
647
+ "epoch": 12.26,
648
+ "learning_rate": 6.01789638932496e-05,
649
+ "loss": 3.0306,
650
+ "step": 8300
651
+ },
652
+ {
653
+ "epoch": 12.41,
654
+ "learning_rate": 5.994348508634223e-05,
655
+ "loss": 3.0425,
656
+ "step": 8400
657
+ },
658
+ {
659
+ "epoch": 12.56,
660
+ "learning_rate": 5.9710361067503915e-05,
661
+ "loss": 3.0266,
662
+ "step": 8500
663
+ },
664
+ {
665
+ "epoch": 12.56,
666
+ "eval_loss": 1.76727294921875,
667
+ "eval_runtime": 114.3494,
668
+ "eval_samples_per_second": 17.674,
669
+ "eval_steps_per_second": 2.213,
670
+ "eval_wer": 2.0895596239485403,
671
+ "step": 8500
672
+ },
673
+ {
674
+ "epoch": 12.7,
675
+ "learning_rate": 5.9474882260596537e-05,
676
+ "loss": 3.0398,
677
+ "step": 8600
678
+ },
679
+ {
680
+ "epoch": 12.85,
681
+ "learning_rate": 5.9239403453689165e-05,
682
+ "loss": 2.9985,
683
+ "step": 8700
684
+ },
685
+ {
686
+ "epoch": 13.0,
687
+ "learning_rate": 5.900392464678179e-05,
688
+ "loss": 2.9969,
689
+ "step": 8800
690
+ },
691
+ {
692
+ "epoch": 13.15,
693
+ "learning_rate": 5.876844583987441e-05,
694
+ "loss": 2.9648,
695
+ "step": 8900
696
+ },
697
+ {
698
+ "epoch": 13.29,
699
+ "learning_rate": 5.8532967032967024e-05,
700
+ "loss": 2.9451,
701
+ "step": 9000
702
+ },
703
+ {
704
+ "epoch": 13.29,
705
+ "eval_loss": 1.665855884552002,
706
+ "eval_runtime": 116.4877,
707
+ "eval_samples_per_second": 17.349,
708
+ "eval_steps_per_second": 2.172,
709
+ "eval_wer": 2.1380504700643246,
710
+ "step": 9000
711
+ },
712
+ {
713
+ "epoch": 13.44,
714
+ "learning_rate": 5.8297488226059645e-05,
715
+ "loss": 2.9573,
716
+ "step": 9100
717
+ },
718
+ {
719
+ "epoch": 13.59,
720
+ "learning_rate": 5.8062009419152274e-05,
721
+ "loss": 2.8819,
722
+ "step": 9200
723
+ },
724
+ {
725
+ "epoch": 13.74,
726
+ "learning_rate": 5.7826530612244896e-05,
727
+ "loss": 2.8901,
728
+ "step": 9300
729
+ },
730
+ {
731
+ "epoch": 13.88,
732
+ "learning_rate": 5.759105180533752e-05,
733
+ "loss": 2.8492,
734
+ "step": 9400
735
+ },
736
+ {
737
+ "epoch": 14.03,
738
+ "learning_rate": 5.735557299843013e-05,
739
+ "loss": 2.8802,
740
+ "step": 9500
741
+ },
742
+ {
743
+ "epoch": 14.03,
744
+ "eval_loss": 1.5637215375900269,
745
+ "eval_runtime": 115.1622,
746
+ "eval_samples_per_second": 17.549,
747
+ "eval_steps_per_second": 2.197,
748
+ "eval_wer": 2.1969322117763483,
749
+ "step": 9500
750
+ },
751
+ {
752
+ "epoch": 14.18,
753
+ "learning_rate": 5.7120094191522754e-05,
754
+ "loss": 2.8346,
755
+ "step": 9600
756
+ },
757
+ {
758
+ "epoch": 14.33,
759
+ "learning_rate": 5.6884615384615376e-05,
760
+ "loss": 2.8355,
761
+ "step": 9700
762
+ },
763
+ {
764
+ "epoch": 14.48,
765
+ "learning_rate": 5.6649136577708005e-05,
766
+ "loss": 2.8124,
767
+ "step": 9800
768
+ },
769
+ {
770
+ "epoch": 14.62,
771
+ "learning_rate": 5.6413657770800626e-05,
772
+ "loss": 2.7879,
773
+ "step": 9900
774
+ },
775
+ {
776
+ "epoch": 14.77,
777
+ "learning_rate": 5.617817896389324e-05,
778
+ "loss": 2.78,
779
+ "step": 10000
780
+ },
781
+ {
782
+ "epoch": 14.77,
783
+ "eval_loss": 1.4921427965164185,
784
+ "eval_runtime": 115.1,
785
+ "eval_samples_per_second": 17.559,
786
+ "eval_steps_per_second": 2.198,
787
+ "eval_wer": 2.2335477486392876,
788
+ "step": 10000
789
+ },
790
+ {
791
+ "epoch": 14.92,
792
+ "learning_rate": 5.594270015698586e-05,
793
+ "loss": 2.775,
794
+ "step": 10100
795
+ },
796
+ {
797
+ "epoch": 15.07,
798
+ "learning_rate": 5.5707221350078485e-05,
799
+ "loss": 2.7478,
800
+ "step": 10200
801
+ },
802
+ {
803
+ "epoch": 15.21,
804
+ "learning_rate": 5.5471742543171114e-05,
805
+ "loss": 2.7224,
806
+ "step": 10300
807
+ },
808
+ {
809
+ "epoch": 15.36,
810
+ "learning_rate": 5.5236263736263735e-05,
811
+ "loss": 2.7506,
812
+ "step": 10400
813
+ },
814
+ {
815
+ "epoch": 15.51,
816
+ "learning_rate": 5.500078492935635e-05,
817
+ "loss": 2.7049,
818
+ "step": 10500
819
+ },
820
+ {
821
+ "epoch": 15.51,
822
+ "eval_loss": 1.413183569908142,
823
+ "eval_runtime": 114.2743,
824
+ "eval_samples_per_second": 17.686,
825
+ "eval_steps_per_second": 2.214,
826
+ "eval_wer": 2.221672439386442,
827
+ "step": 10500
828
+ },
829
+ {
830
+ "epoch": 15.66,
831
+ "learning_rate": 5.476766091051805e-05,
832
+ "loss": 2.7145,
833
+ "step": 10600
834
+ },
835
+ {
836
+ "epoch": 15.8,
837
+ "learning_rate": 5.453218210361067e-05,
838
+ "loss": 2.6892,
839
+ "step": 10700
840
+ },
841
+ {
842
+ "epoch": 15.95,
843
+ "learning_rate": 5.4296703296703295e-05,
844
+ "loss": 2.69,
845
+ "step": 10800
846
+ },
847
+ {
848
+ "epoch": 16.1,
849
+ "learning_rate": 5.406122448979591e-05,
850
+ "loss": 2.623,
851
+ "step": 10900
852
+ },
853
+ {
854
+ "epoch": 16.25,
855
+ "learning_rate": 5.382574568288853e-05,
856
+ "loss": 2.6768,
857
+ "step": 11000
858
+ },
859
+ {
860
+ "epoch": 16.25,
861
+ "eval_loss": 1.3666878938674927,
862
+ "eval_runtime": 119.4402,
863
+ "eval_samples_per_second": 16.921,
864
+ "eval_steps_per_second": 2.118,
865
+ "eval_wer": 2.223156853043048,
866
+ "step": 11000
867
+ },
868
+ {
869
+ "epoch": 16.4,
870
+ "learning_rate": 5.359262166405023e-05,
871
+ "loss": 2.628,
872
+ "step": 11100
873
+ },
874
+ {
875
+ "epoch": 16.54,
876
+ "learning_rate": 5.3357142857142854e-05,
877
+ "loss": 2.6163,
878
+ "step": 11200
879
+ },
880
+ {
881
+ "epoch": 16.69,
882
+ "learning_rate": 5.312166405023547e-05,
883
+ "loss": 2.6193,
884
+ "step": 11300
885
+ },
886
+ {
887
+ "epoch": 16.84,
888
+ "learning_rate": 5.28861852433281e-05,
889
+ "loss": 2.6531,
890
+ "step": 11400
891
+ },
892
+ {
893
+ "epoch": 16.99,
894
+ "learning_rate": 5.265070643642072e-05,
895
+ "loss": 2.6358,
896
+ "step": 11500
897
+ },
898
+ {
899
+ "epoch": 16.99,
900
+ "eval_loss": 1.311090111732483,
901
+ "eval_runtime": 116.2157,
902
+ "eval_samples_per_second": 17.39,
903
+ "eval_steps_per_second": 2.177,
904
+ "eval_wer": 2.128649183572489,
905
+ "step": 11500
906
+ },
907
+ {
908
+ "epoch": 17.13,
909
+ "learning_rate": 5.241522762951334e-05,
910
+ "loss": 2.5748,
911
+ "step": 11600
912
+ },
913
+ {
914
+ "epoch": 17.28,
915
+ "learning_rate": 5.217974882260596e-05,
916
+ "loss": 2.6287,
917
+ "step": 11700
918
+ },
919
+ {
920
+ "epoch": 17.43,
921
+ "learning_rate": 5.194427001569858e-05,
922
+ "loss": 2.5583,
923
+ "step": 11800
924
+ },
925
+ {
926
+ "epoch": 17.58,
927
+ "learning_rate": 5.17087912087912e-05,
928
+ "loss": 2.5547,
929
+ "step": 11900
930
+ },
931
+ {
932
+ "epoch": 17.72,
933
+ "learning_rate": 5.147331240188383e-05,
934
+ "loss": 2.5802,
935
+ "step": 12000
936
+ },
937
+ {
938
+ "epoch": 17.72,
939
+ "eval_loss": 1.2678567171096802,
940
+ "eval_runtime": 116.076,
941
+ "eval_samples_per_second": 17.411,
942
+ "eval_steps_per_second": 2.18,
943
+ "eval_wer": 2.1429985155863434,
944
+ "step": 12000
945
+ },
946
+ {
947
+ "epoch": 17.87,
948
+ "learning_rate": 5.123783359497645e-05,
949
+ "loss": 2.557,
950
+ "step": 12100
951
+ },
952
+ {
953
+ "epoch": 18.02,
954
+ "learning_rate": 5.100235478806907e-05,
955
+ "loss": 2.5771,
956
+ "step": 12200
957
+ },
958
+ {
959
+ "epoch": 18.17,
960
+ "learning_rate": 5.076687598116169e-05,
961
+ "loss": 2.5393,
962
+ "step": 12300
963
+ },
964
+ {
965
+ "epoch": 18.32,
966
+ "learning_rate": 5.053375196232339e-05,
967
+ "loss": 2.5031,
968
+ "step": 12400
969
+ },
970
+ {
971
+ "epoch": 18.46,
972
+ "learning_rate": 5.029827315541601e-05,
973
+ "loss": 2.5012,
974
+ "step": 12500
975
+ },
976
+ {
977
+ "epoch": 18.46,
978
+ "eval_loss": 1.2365446090698242,
979
+ "eval_runtime": 116.0118,
980
+ "eval_samples_per_second": 17.421,
981
+ "eval_steps_per_second": 2.181,
982
+ "eval_wer": 2.115289460663038,
983
+ "step": 12500
984
+ },
985
+ {
986
+ "epoch": 18.61,
987
+ "learning_rate": 5.006279434850863e-05,
988
+ "loss": 2.54,
989
+ "step": 12600
990
+ },
991
+ {
992
+ "epoch": 18.76,
993
+ "learning_rate": 4.9827315541601246e-05,
994
+ "loss": 2.5072,
995
+ "step": 12700
996
+ },
997
+ {
998
+ "epoch": 18.91,
999
+ "learning_rate": 4.9591836734693875e-05,
1000
+ "loss": 2.4951,
1001
+ "step": 12800
1002
+ },
1003
+ {
1004
+ "epoch": 19.05,
1005
+ "learning_rate": 4.9356357927786497e-05,
1006
+ "loss": 2.4789,
1007
+ "step": 12900
1008
+ },
1009
+ {
1010
+ "epoch": 19.2,
1011
+ "learning_rate": 4.912087912087912e-05,
1012
+ "loss": 2.458,
1013
+ "step": 13000
1014
+ },
1015
+ {
1016
+ "epoch": 19.2,
1017
+ "eval_loss": 1.2117862701416016,
1018
+ "eval_runtime": 116.2579,
1019
+ "eval_samples_per_second": 17.384,
1020
+ "eval_steps_per_second": 2.176,
1021
+ "eval_wer": 2.1573478476001977,
1022
+ "step": 13000
1023
+ },
1024
+ {
1025
+ "epoch": 19.35,
1026
+ "learning_rate": 4.888540031397174e-05,
1027
+ "loss": 2.4616,
1028
+ "step": 13100
1029
+ },
1030
+ {
1031
+ "epoch": 19.5,
1032
+ "learning_rate": 4.8649921507064355e-05,
1033
+ "loss": 2.4739,
1034
+ "step": 13200
1035
+ },
1036
+ {
1037
+ "epoch": 19.65,
1038
+ "learning_rate": 4.8414442700156984e-05,
1039
+ "loss": 2.4867,
1040
+ "step": 13300
1041
+ },
1042
+ {
1043
+ "epoch": 19.79,
1044
+ "learning_rate": 4.8178963893249605e-05,
1045
+ "loss": 2.4568,
1046
+ "step": 13400
1047
+ },
1048
+ {
1049
+ "epoch": 19.94,
1050
+ "learning_rate": 4.794348508634223e-05,
1051
+ "loss": 2.4433,
1052
+ "step": 13500
1053
+ },
1054
+ {
1055
+ "epoch": 19.94,
1056
+ "eval_loss": 1.1991767883300781,
1057
+ "eval_runtime": 114.5641,
1058
+ "eval_samples_per_second": 17.641,
1059
+ "eval_steps_per_second": 2.208,
1060
+ "eval_wer": 2.1335972290945078,
1061
+ "step": 13500
1062
+ },
1063
+ {
1064
+ "epoch": 20.09,
1065
+ "learning_rate": 4.770800627943485e-05,
1066
+ "loss": 2.4532,
1067
+ "step": 13600
1068
+ },
1069
+ {
1070
+ "epoch": 20.24,
1071
+ "learning_rate": 4.7472527472527464e-05,
1072
+ "loss": 2.3913,
1073
+ "step": 13700
1074
+ },
1075
+ {
1076
+ "epoch": 20.38,
1077
+ "learning_rate": 4.7237048665620086e-05,
1078
+ "loss": 2.421,
1079
+ "step": 13800
1080
+ },
1081
+ {
1082
+ "epoch": 20.53,
1083
+ "learning_rate": 4.7001569858712714e-05,
1084
+ "loss": 2.4526,
1085
+ "step": 13900
1086
+ },
1087
+ {
1088
+ "epoch": 20.68,
1089
+ "learning_rate": 4.6766091051805336e-05,
1090
+ "loss": 2.438,
1091
+ "step": 14000
1092
+ },
1093
+ {
1094
+ "epoch": 20.68,
1095
+ "eval_loss": 1.180332064628601,
1096
+ "eval_runtime": 116.5012,
1097
+ "eval_samples_per_second": 17.347,
1098
+ "eval_steps_per_second": 2.172,
1099
+ "eval_wer": 2.1509153884215735,
1100
+ "step": 14000
1101
+ },
1102
+ {
1103
+ "epoch": 20.83,
1104
+ "learning_rate": 4.653061224489796e-05,
1105
+ "loss": 2.4034,
1106
+ "step": 14100
1107
+ },
1108
+ {
1109
+ "epoch": 20.97,
1110
+ "learning_rate": 4.629513343799057e-05,
1111
+ "loss": 2.4306,
1112
+ "step": 14200
1113
+ },
1114
+ {
1115
+ "epoch": 21.12,
1116
+ "learning_rate": 4.6059654631083195e-05,
1117
+ "loss": 2.4145,
1118
+ "step": 14300
1119
+ },
1120
+ {
1121
+ "epoch": 21.27,
1122
+ "learning_rate": 4.582417582417582e-05,
1123
+ "loss": 2.4677,
1124
+ "step": 14400
1125
+ },
1126
+ {
1127
+ "epoch": 21.42,
1128
+ "learning_rate": 4.5588697017268445e-05,
1129
+ "loss": 2.418,
1130
+ "step": 14500
1131
+ },
1132
+ {
1133
+ "epoch": 21.42,
1134
+ "eval_loss": 1.1601430177688599,
1135
+ "eval_runtime": 114.5652,
1136
+ "eval_samples_per_second": 17.641,
1137
+ "eval_steps_per_second": 2.208,
1138
+ "eval_wer": 2.1232063334982683,
1139
+ "step": 14500
1140
+ },
1141
+ {
1142
+ "epoch": 21.57,
1143
+ "learning_rate": 4.535321821036107e-05,
1144
+ "loss": 2.3967,
1145
+ "step": 14600
1146
+ },
1147
+ {
1148
+ "epoch": 21.71,
1149
+ "learning_rate": 4.511773940345368e-05,
1150
+ "loss": 2.3939,
1151
+ "step": 14700
1152
+ },
1153
+ {
1154
+ "epoch": 21.86,
1155
+ "learning_rate": 4.4882260596546304e-05,
1156
+ "loss": 2.3925,
1157
+ "step": 14800
1158
+ },
1159
+ {
1160
+ "epoch": 22.01,
1161
+ "learning_rate": 4.4646781789638925e-05,
1162
+ "loss": 2.3596,
1163
+ "step": 14900
1164
+ },
1165
+ {
1166
+ "epoch": 22.16,
1167
+ "learning_rate": 4.4411302982731554e-05,
1168
+ "loss": 2.3322,
1169
+ "step": 15000
1170
+ },
1171
+ {
1172
+ "epoch": 22.16,
1173
+ "eval_loss": 1.1417704820632935,
1174
+ "eval_runtime": 116.2111,
1175
+ "eval_samples_per_second": 17.391,
1176
+ "eval_steps_per_second": 2.177,
1177
+ "eval_wer": 2.1929737753587335,
1178
+ "step": 15000
1179
+ },
1180
+ {
1181
+ "epoch": 22.3,
1182
+ "learning_rate": 4.4175824175824176e-05,
1183
+ "loss": 2.3821,
1184
+ "step": 15100
1185
+ },
1186
+ {
1187
+ "epoch": 22.45,
1188
+ "learning_rate": 4.394034536891679e-05,
1189
+ "loss": 2.3435,
1190
+ "step": 15200
1191
+ },
1192
+ {
1193
+ "epoch": 22.6,
1194
+ "learning_rate": 4.370486656200941e-05,
1195
+ "loss": 2.3542,
1196
+ "step": 15300
1197
+ },
1198
+ {
1199
+ "epoch": 22.75,
1200
+ "learning_rate": 4.3469387755102034e-05,
1201
+ "loss": 2.3469,
1202
+ "step": 15400
1203
+ },
1204
+ {
1205
+ "epoch": 22.89,
1206
+ "learning_rate": 4.323390894819466e-05,
1207
+ "loss": 2.3387,
1208
+ "step": 15500
1209
+ },
1210
+ {
1211
+ "epoch": 22.89,
1212
+ "eval_loss": 1.1172302961349487,
1213
+ "eval_runtime": 114.3169,
1214
+ "eval_samples_per_second": 17.679,
1215
+ "eval_steps_per_second": 2.213,
1216
+ "eval_wer": 2.2464126669965365,
1217
+ "step": 15500
1218
+ },
1219
+ {
1220
+ "epoch": 23.04,
1221
+ "learning_rate": 4.2998430141287285e-05,
1222
+ "loss": 2.3688,
1223
+ "step": 15600
1224
+ },
1225
+ {
1226
+ "epoch": 23.19,
1227
+ "learning_rate": 4.27629513343799e-05,
1228
+ "loss": 2.3344,
1229
+ "step": 15700
1230
+ },
1231
+ {
1232
+ "epoch": 23.34,
1233
+ "learning_rate": 4.252747252747252e-05,
1234
+ "loss": 2.3245,
1235
+ "step": 15800
1236
+ },
1237
+ {
1238
+ "epoch": 23.49,
1239
+ "learning_rate": 4.229199372056514e-05,
1240
+ "loss": 2.3523,
1241
+ "step": 15900
1242
+ },
1243
+ {
1244
+ "epoch": 23.63,
1245
+ "learning_rate": 4.205651491365777e-05,
1246
+ "loss": 2.3349,
1247
+ "step": 16000
1248
+ },
1249
+ {
1250
+ "epoch": 23.63,
1251
+ "eval_loss": 1.1144375801086426,
1252
+ "eval_runtime": 116.2412,
1253
+ "eval_samples_per_second": 17.386,
1254
+ "eval_steps_per_second": 2.177,
1255
+ "eval_wer": 2.185551707075705,
1256
+ "step": 16000
1257
+ },
1258
+ {
1259
+ "epoch": 23.78,
1260
+ "learning_rate": 4.1821036106750393e-05,
1261
+ "loss": 2.2847,
1262
+ "step": 16100
1263
+ },
1264
+ {
1265
+ "epoch": 23.93,
1266
+ "learning_rate": 4.158555729984301e-05,
1267
+ "loss": 2.3303,
1268
+ "step": 16200
1269
+ },
1270
+ {
1271
+ "epoch": 24.08,
1272
+ "learning_rate": 4.135007849293563e-05,
1273
+ "loss": 2.2994,
1274
+ "step": 16300
1275
+ },
1276
+ {
1277
+ "epoch": 24.22,
1278
+ "learning_rate": 4.111459968602825e-05,
1279
+ "loss": 2.2887,
1280
+ "step": 16400
1281
+ },
1282
+ {
1283
+ "epoch": 24.37,
1284
+ "learning_rate": 4.0879120879120874e-05,
1285
+ "loss": 2.291,
1286
+ "step": 16500
1287
+ },
1288
+ {
1289
+ "epoch": 24.37,
1290
+ "eval_loss": 1.1018128395080566,
1291
+ "eval_runtime": 114.9042,
1292
+ "eval_samples_per_second": 17.589,
1293
+ "eval_steps_per_second": 2.202,
1294
+ "eval_wer": 2.1929737753587335,
1295
+ "step": 16500
1296
+ },
1297
+ {
1298
+ "epoch": 24.52,
1299
+ "learning_rate": 4.06436420722135e-05,
1300
+ "loss": 2.2888,
1301
+ "step": 16600
1302
+ },
1303
+ {
1304
+ "epoch": 24.67,
1305
+ "learning_rate": 4.040816326530612e-05,
1306
+ "loss": 2.2724,
1307
+ "step": 16700
1308
+ },
1309
+ {
1310
+ "epoch": 24.82,
1311
+ "learning_rate": 4.017268445839874e-05,
1312
+ "loss": 2.2922,
1313
+ "step": 16800
1314
+ },
1315
+ {
1316
+ "epoch": 24.96,
1317
+ "learning_rate": 3.993720565149136e-05,
1318
+ "loss": 2.2934,
1319
+ "step": 16900
1320
+ },
1321
+ {
1322
+ "epoch": 25.11,
1323
+ "learning_rate": 3.970172684458398e-05,
1324
+ "loss": 2.2766,
1325
+ "step": 17000
1326
+ },
1327
+ {
1328
+ "epoch": 25.11,
1329
+ "eval_loss": 1.0882744789123535,
1330
+ "eval_runtime": 117.2941,
1331
+ "eval_samples_per_second": 17.23,
1332
+ "eval_steps_per_second": 2.157,
1333
+ "eval_wer": 2.1761504205838693,
1334
+ "step": 17000
1335
+ },
1336
+ {
1337
+ "epoch": 25.26,
1338
+ "learning_rate": 3.946624803767661e-05,
1339
+ "loss": 2.2656,
1340
+ "step": 17100
1341
+ },
1342
+ {
1343
+ "epoch": 25.41,
1344
+ "learning_rate": 3.9230769230769226e-05,
1345
+ "loss": 2.2929,
1346
+ "step": 17200
1347
+ },
1348
+ {
1349
+ "epoch": 25.55,
1350
+ "learning_rate": 3.899529042386185e-05,
1351
+ "loss": 2.2513,
1352
+ "step": 17300
1353
+ },
1354
+ {
1355
+ "epoch": 25.7,
1356
+ "learning_rate": 3.875981161695447e-05,
1357
+ "loss": 2.2603,
1358
+ "step": 17400
1359
+ },
1360
+ {
1361
+ "epoch": 25.85,
1362
+ "learning_rate": 3.852433281004709e-05,
1363
+ "loss": 2.2534,
1364
+ "step": 17500
1365
+ },
1366
+ {
1367
+ "epoch": 25.85,
1368
+ "eval_loss": 1.0743526220321655,
1369
+ "eval_runtime": 118.2043,
1370
+ "eval_samples_per_second": 17.098,
1371
+ "eval_steps_per_second": 2.14,
1372
+ "eval_wer": 2.1875309252845128,
1373
+ "step": 17500
1374
+ },
1375
+ {
1376
+ "epoch": 26.0,
1377
+ "learning_rate": 3.8288854003139713e-05,
1378
+ "loss": 2.2716,
1379
+ "step": 17600
1380
+ },
1381
+ {
1382
+ "epoch": 26.14,
1383
+ "learning_rate": 3.8053375196232335e-05,
1384
+ "loss": 2.2486,
1385
+ "step": 17700
1386
+ },
1387
+ {
1388
+ "epoch": 26.29,
1389
+ "learning_rate": 3.781789638932496e-05,
1390
+ "loss": 2.2068,
1391
+ "step": 17800
1392
+ },
1393
+ {
1394
+ "epoch": 26.44,
1395
+ "learning_rate": 3.758241758241758e-05,
1396
+ "loss": 2.2431,
1397
+ "step": 17900
1398
+ },
1399
+ {
1400
+ "epoch": 26.59,
1401
+ "learning_rate": 3.73469387755102e-05,
1402
+ "loss": 2.2393,
1403
+ "step": 18000
1404
+ },
1405
+ {
1406
+ "epoch": 26.59,
1407
+ "eval_loss": 1.0561192035675049,
1408
+ "eval_runtime": 116.8996,
1409
+ "eval_samples_per_second": 17.288,
1410
+ "eval_steps_per_second": 2.164,
1411
+ "eval_wer": 2.1845620979713014,
1412
+ "step": 18000
1413
+ },
1414
+ {
1415
+ "epoch": 26.74,
1416
+ "learning_rate": 3.711145996860282e-05,
1417
+ "loss": 2.1944,
1418
+ "step": 18100
1419
+ },
1420
+ {
1421
+ "epoch": 26.88,
1422
+ "learning_rate": 3.6875981161695444e-05,
1423
+ "loss": 2.2359,
1424
+ "step": 18200
1425
+ },
1426
+ {
1427
+ "epoch": 27.03,
1428
+ "learning_rate": 3.664285714285714e-05,
1429
+ "loss": 2.2097,
1430
+ "step": 18300
1431
+ },
1432
+ {
1433
+ "epoch": 27.18,
1434
+ "learning_rate": 3.640737833594976e-05,
1435
+ "loss": 2.1431,
1436
+ "step": 18400
1437
+ },
1438
+ {
1439
+ "epoch": 27.33,
1440
+ "learning_rate": 3.617189952904238e-05,
1441
+ "loss": 2.2085,
1442
+ "step": 18500
1443
+ },
1444
+ {
1445
+ "epoch": 27.33,
1446
+ "eval_loss": 1.0465816259384155,
1447
+ "eval_runtime": 115.87,
1448
+ "eval_samples_per_second": 17.442,
1449
+ "eval_steps_per_second": 2.183,
1450
+ "eval_wer": 2.1444829292429493,
1451
+ "step": 18500
1452
+ },
1453
+ {
1454
+ "epoch": 27.47,
1455
+ "learning_rate": 3.5936420722135003e-05,
1456
+ "loss": 2.2204,
1457
+ "step": 18600
1458
+ },
1459
+ {
1460
+ "epoch": 27.62,
1461
+ "learning_rate": 3.5700941915227625e-05,
1462
+ "loss": 2.242,
1463
+ "step": 18700
1464
+ },
1465
+ {
1466
+ "epoch": 27.77,
1467
+ "learning_rate": 3.546546310832025e-05,
1468
+ "loss": 2.1699,
1469
+ "step": 18800
1470
+ },
1471
+ {
1472
+ "epoch": 27.92,
1473
+ "learning_rate": 3.522998430141287e-05,
1474
+ "loss": 2.2152,
1475
+ "step": 18900
1476
+ },
1477
+ {
1478
+ "epoch": 28.06,
1479
+ "learning_rate": 3.499450549450549e-05,
1480
+ "loss": 2.1966,
1481
+ "step": 19000
1482
+ },
1483
+ {
1484
+ "epoch": 28.06,
1485
+ "eval_loss": 1.0382250547409058,
1486
+ "eval_runtime": 116.4655,
1487
+ "eval_samples_per_second": 17.353,
1488
+ "eval_steps_per_second": 2.172,
1489
+ "eval_wer": 2.1088570014844135,
1490
+ "step": 19000
1491
+ },
1492
+ {
1493
+ "epoch": 28.21,
1494
+ "learning_rate": 3.475902668759811e-05,
1495
+ "loss": 2.169,
1496
+ "step": 19100
1497
+ },
1498
+ {
1499
+ "epoch": 28.36,
1500
+ "learning_rate": 3.4523547880690734e-05,
1501
+ "loss": 2.1981,
1502
+ "step": 19200
1503
+ },
1504
+ {
1505
+ "epoch": 28.51,
1506
+ "learning_rate": 3.4288069073783356e-05,
1507
+ "loss": 2.1692,
1508
+ "step": 19300
1509
+ },
1510
+ {
1511
+ "epoch": 28.66,
1512
+ "learning_rate": 3.405259026687598e-05,
1513
+ "loss": 2.1931,
1514
+ "step": 19400
1515
+ },
1516
+ {
1517
+ "epoch": 28.8,
1518
+ "learning_rate": 3.38171114599686e-05,
1519
+ "loss": 2.1794,
1520
+ "step": 19500
1521
+ },
1522
+ {
1523
+ "epoch": 28.8,
1524
+ "eval_loss": 1.0263785123825073,
1525
+ "eval_runtime": 114.5988,
1526
+ "eval_samples_per_second": 17.635,
1527
+ "eval_steps_per_second": 2.208,
1528
+ "eval_wer": 1.9861454725383474,
1529
+ "step": 19500
1530
+ },
1531
+ {
1532
+ "epoch": 28.95,
1533
+ "learning_rate": 3.358163265306122e-05,
1534
+ "loss": 2.1638,
1535
+ "step": 19600
1536
+ },
1537
+ {
1538
+ "epoch": 29.1,
1539
+ "learning_rate": 3.334615384615384e-05,
1540
+ "loss": 2.1714,
1541
+ "step": 19700
1542
+ },
1543
+ {
1544
+ "epoch": 29.25,
1545
+ "learning_rate": 3.3110675039246465e-05,
1546
+ "loss": 2.1514,
1547
+ "step": 19800
1548
+ },
1549
+ {
1550
+ "epoch": 29.39,
1551
+ "learning_rate": 3.2875196232339087e-05,
1552
+ "loss": 2.1374,
1553
+ "step": 19900
1554
+ },
1555
+ {
1556
+ "epoch": 29.54,
1557
+ "learning_rate": 3.263971742543171e-05,
1558
+ "loss": 2.1423,
1559
+ "step": 20000
1560
+ },
1561
+ {
1562
+ "epoch": 29.54,
1563
+ "eval_loss": 1.0245550870895386,
1564
+ "eval_runtime": 116.8375,
1565
+ "eval_samples_per_second": 17.298,
1566
+ "eval_steps_per_second": 2.165,
1567
+ "eval_wer": 1.9678377041068777,
1568
+ "step": 20000
1569
+ },
1570
+ {
1571
+ "epoch": 29.69,
1572
+ "learning_rate": 3.240423861852433e-05,
1573
+ "loss": 2.1807,
1574
+ "step": 20100
1575
+ },
1576
+ {
1577
+ "epoch": 29.84,
1578
+ "learning_rate": 3.216875981161695e-05,
1579
+ "loss": 2.1545,
1580
+ "step": 20200
1581
+ },
1582
+ {
1583
+ "epoch": 29.98,
1584
+ "learning_rate": 3.1933281004709574e-05,
1585
+ "loss": 2.1404,
1586
+ "step": 20300
1587
+ },
1588
+ {
1589
+ "epoch": 30.13,
1590
+ "learning_rate": 3.1697802197802195e-05,
1591
+ "loss": 2.1089,
1592
+ "step": 20400
1593
+ },
1594
+ {
1595
+ "epoch": 30.28,
1596
+ "learning_rate": 3.146232339089482e-05,
1597
+ "loss": 2.1649,
1598
+ "step": 20500
1599
+ },
1600
+ {
1601
+ "epoch": 30.28,
1602
+ "eval_loss": 0.9981661438941956,
1603
+ "eval_runtime": 116.056,
1604
+ "eval_samples_per_second": 17.414,
1605
+ "eval_steps_per_second": 2.18,
1606
+ "eval_wer": 2.000494804552202,
1607
+ "step": 20500
1608
+ },
1609
+ {
1610
+ "epoch": 30.43,
1611
+ "learning_rate": 3.122684458398744e-05,
1612
+ "loss": 2.1425,
1613
+ "step": 20600
1614
+ },
1615
+ {
1616
+ "epoch": 30.58,
1617
+ "learning_rate": 3.099136577708006e-05,
1618
+ "loss": 2.1357,
1619
+ "step": 20700
1620
+ },
1621
+ {
1622
+ "epoch": 30.72,
1623
+ "learning_rate": 3.0758241758241755e-05,
1624
+ "loss": 2.1251,
1625
+ "step": 20800
1626
+ },
1627
+ {
1628
+ "epoch": 30.87,
1629
+ "learning_rate": 3.052276295133438e-05,
1630
+ "loss": 2.1256,
1631
+ "step": 20900
1632
+ },
1633
+ {
1634
+ "epoch": 31.02,
1635
+ "learning_rate": 3.0287284144427e-05,
1636
+ "loss": 2.143,
1637
+ "step": 21000
1638
+ },
1639
+ {
1640
+ "epoch": 31.02,
1641
+ "eval_loss": 0.9985482692718506,
1642
+ "eval_runtime": 116.0424,
1643
+ "eval_samples_per_second": 17.416,
1644
+ "eval_steps_per_second": 2.18,
1645
+ "eval_wer": 2.045027214250371,
1646
+ "step": 21000
1647
+ },
1648
+ {
1649
+ "epoch": 31.17,
1650
+ "learning_rate": 3.005180533751962e-05,
1651
+ "loss": 2.0744,
1652
+ "step": 21100
1653
+ },
1654
+ {
1655
+ "epoch": 31.31,
1656
+ "learning_rate": 2.9816326530612242e-05,
1657
+ "loss": 2.0831,
1658
+ "step": 21200
1659
+ },
1660
+ {
1661
+ "epoch": 31.46,
1662
+ "learning_rate": 2.9583202511773936e-05,
1663
+ "loss": 2.1254,
1664
+ "step": 21300
1665
+ },
1666
+ {
1667
+ "epoch": 31.61,
1668
+ "learning_rate": 2.934772370486656e-05,
1669
+ "loss": 2.1357,
1670
+ "step": 21400
1671
+ },
1672
+ {
1673
+ "epoch": 31.76,
1674
+ "learning_rate": 2.911224489795918e-05,
1675
+ "loss": 2.1338,
1676
+ "step": 21500
1677
+ },
1678
+ {
1679
+ "epoch": 31.76,
1680
+ "eval_loss": 0.9932034611701965,
1681
+ "eval_runtime": 114.6961,
1682
+ "eval_samples_per_second": 17.62,
1683
+ "eval_steps_per_second": 2.206,
1684
+ "eval_wer": 2.0024740227610094,
1685
+ "step": 21500
1686
+ },
1687
+ {
1688
+ "epoch": 31.91,
1689
+ "learning_rate": 2.8876766091051805e-05,
1690
+ "loss": 2.1053,
1691
+ "step": 21600
1692
+ },
1693
+ {
1694
+ "epoch": 32.05,
1695
+ "learning_rate": 2.8641287284144426e-05,
1696
+ "loss": 2.1111,
1697
+ "step": 21700
1698
+ },
1699
+ {
1700
+ "epoch": 32.2,
1701
+ "learning_rate": 2.8405808477237045e-05,
1702
+ "loss": 2.1028,
1703
+ "step": 21800
1704
+ },
1705
+ {
1706
+ "epoch": 32.35,
1707
+ "learning_rate": 2.817032967032967e-05,
1708
+ "loss": 2.0879,
1709
+ "step": 21900
1710
+ },
1711
+ {
1712
+ "epoch": 32.5,
1713
+ "learning_rate": 2.793485086342229e-05,
1714
+ "loss": 2.1076,
1715
+ "step": 22000
1716
+ },
1717
+ {
1718
+ "epoch": 32.5,
1719
+ "eval_loss": 0.9902665019035339,
1720
+ "eval_runtime": 120.6987,
1721
+ "eval_samples_per_second": 16.744,
1722
+ "eval_steps_per_second": 2.096,
1723
+ "eval_wer": 2.0504700643245917,
1724
+ "step": 22000
1725
+ },
1726
+ {
1727
+ "epoch": 32.64,
1728
+ "learning_rate": 2.769937205651491e-05,
1729
+ "loss": 2.1107,
1730
+ "step": 22100
1731
+ },
1732
+ {
1733
+ "epoch": 32.79,
1734
+ "learning_rate": 2.7463893249607535e-05,
1735
+ "loss": 2.0953,
1736
+ "step": 22200
1737
+ },
1738
+ {
1739
+ "epoch": 32.94,
1740
+ "learning_rate": 2.7228414442700154e-05,
1741
+ "loss": 2.0619,
1742
+ "step": 22300
1743
+ },
1744
+ {
1745
+ "epoch": 33.09,
1746
+ "learning_rate": 2.6992935635792776e-05,
1747
+ "loss": 2.0531,
1748
+ "step": 22400
1749
+ },
1750
+ {
1751
+ "epoch": 33.23,
1752
+ "learning_rate": 2.6757456828885397e-05,
1753
+ "loss": 2.0519,
1754
+ "step": 22500
1755
+ },
1756
+ {
1757
+ "epoch": 33.23,
1758
+ "eval_loss": 0.9833839535713196,
1759
+ "eval_runtime": 116.5317,
1760
+ "eval_samples_per_second": 17.343,
1761
+ "eval_steps_per_second": 2.171,
1762
+ "eval_wer": 2.07372587827808,
1763
+ "step": 22500
1764
+ },
1765
+ {
1766
+ "epoch": 33.38,
1767
+ "learning_rate": 2.652197802197802e-05,
1768
+ "loss": 2.0493,
1769
+ "step": 22600
1770
+ },
1771
+ {
1772
+ "epoch": 33.53,
1773
+ "learning_rate": 2.6286499215070644e-05,
1774
+ "loss": 2.0749,
1775
+ "step": 22700
1776
+ },
1777
+ {
1778
+ "epoch": 33.68,
1779
+ "learning_rate": 2.6051020408163263e-05,
1780
+ "loss": 2.0838,
1781
+ "step": 22800
1782
+ },
1783
+ {
1784
+ "epoch": 33.83,
1785
+ "learning_rate": 2.5815541601255884e-05,
1786
+ "loss": 2.0629,
1787
+ "step": 22900
1788
+ },
1789
+ {
1790
+ "epoch": 33.97,
1791
+ "learning_rate": 2.5580062794348506e-05,
1792
+ "loss": 2.0534,
1793
+ "step": 23000
1794
+ },
1795
+ {
1796
+ "epoch": 33.97,
1797
+ "eval_loss": 0.9755652546882629,
1798
+ "eval_runtime": 114.923,
1799
+ "eval_samples_per_second": 17.586,
1800
+ "eval_steps_per_second": 2.201,
1801
+ "eval_wer": 2.024740227610094,
1802
+ "step": 23000
1803
+ },
1804
+ {
1805
+ "epoch": 34.12,
1806
+ "learning_rate": 2.5344583987441128e-05,
1807
+ "loss": 2.067,
1808
+ "step": 23100
1809
+ },
1810
+ {
1811
+ "epoch": 34.27,
1812
+ "learning_rate": 2.5109105180533746e-05,
1813
+ "loss": 2.0252,
1814
+ "step": 23200
1815
+ },
1816
+ {
1817
+ "epoch": 34.42,
1818
+ "learning_rate": 2.487362637362637e-05,
1819
+ "loss": 2.0483,
1820
+ "step": 23300
1821
+ },
1822
+ {
1823
+ "epoch": 34.56,
1824
+ "learning_rate": 2.4638147566718993e-05,
1825
+ "loss": 2.0464,
1826
+ "step": 23400
1827
+ },
1828
+ {
1829
+ "epoch": 34.71,
1830
+ "learning_rate": 2.4402668759811615e-05,
1831
+ "loss": 2.0121,
1832
+ "step": 23500
1833
+ },
1834
+ {
1835
+ "epoch": 34.71,
1836
+ "eval_loss": 0.968792736530304,
1837
+ "eval_runtime": 114.3088,
1838
+ "eval_samples_per_second": 17.68,
1839
+ "eval_steps_per_second": 2.213,
1840
+ "eval_wer": 2.1439881246907473,
1841
+ "step": 23500
1842
+ },
1843
+ {
1844
+ "epoch": 34.86,
1845
+ "learning_rate": 2.4167189952904237e-05,
1846
+ "loss": 2.036,
1847
+ "step": 23600
1848
+ },
1849
+ {
1850
+ "epoch": 35.01,
1851
+ "learning_rate": 2.3931711145996855e-05,
1852
+ "loss": 2.013,
1853
+ "step": 23700
1854
+ },
1855
+ {
1856
+ "epoch": 35.16,
1857
+ "learning_rate": 2.369623233908948e-05,
1858
+ "loss": 2.0043,
1859
+ "step": 23800
1860
+ },
1861
+ {
1862
+ "epoch": 35.3,
1863
+ "learning_rate": 2.3460753532182102e-05,
1864
+ "loss": 2.037,
1865
+ "step": 23900
1866
+ },
1867
+ {
1868
+ "epoch": 35.45,
1869
+ "learning_rate": 2.322527472527472e-05,
1870
+ "loss": 2.0161,
1871
+ "step": 24000
1872
+ },
1873
+ {
1874
+ "epoch": 35.45,
1875
+ "eval_loss": 0.9581586718559265,
1876
+ "eval_runtime": 115.925,
1877
+ "eval_samples_per_second": 17.434,
1878
+ "eval_steps_per_second": 2.182,
1879
+ "eval_wer": 2.1232063334982683,
1880
+ "step": 24000
1881
+ },
1882
+ {
1883
+ "epoch": 35.6,
1884
+ "learning_rate": 2.2989795918367346e-05,
1885
+ "loss": 2.0256,
1886
+ "step": 24100
1887
+ },
1888
+ {
1889
+ "epoch": 35.75,
1890
+ "learning_rate": 2.2754317111459968e-05,
1891
+ "loss": 2.0265,
1892
+ "step": 24200
1893
+ },
1894
+ {
1895
+ "epoch": 35.89,
1896
+ "learning_rate": 2.251883830455259e-05,
1897
+ "loss": 2.0298,
1898
+ "step": 24300
1899
+ },
1900
+ {
1901
+ "epoch": 36.04,
1902
+ "learning_rate": 2.228335949764521e-05,
1903
+ "loss": 2.0028,
1904
+ "step": 24400
1905
+ },
1906
+ {
1907
+ "epoch": 36.19,
1908
+ "learning_rate": 2.204788069073783e-05,
1909
+ "loss": 2.0178,
1910
+ "step": 24500
1911
+ },
1912
+ {
1913
+ "epoch": 36.19,
1914
+ "eval_loss": 0.9480372071266174,
1915
+ "eval_runtime": 116.8212,
1916
+ "eval_samples_per_second": 17.3,
1917
+ "eval_steps_per_second": 2.166,
1918
+ "eval_wer": 2.0895596239485403,
1919
+ "step": 24500
1920
+ },
1921
+ {
1922
+ "epoch": 36.34,
1923
+ "learning_rate": 2.1812401883830455e-05,
1924
+ "loss": 2.008,
1925
+ "step": 24600
1926
+ },
1927
+ {
1928
+ "epoch": 36.48,
1929
+ "learning_rate": 2.1576923076923076e-05,
1930
+ "loss": 2.0132,
1931
+ "step": 24700
1932
+ },
1933
+ {
1934
+ "epoch": 36.63,
1935
+ "learning_rate": 2.1341444270015695e-05,
1936
+ "loss": 2.0204,
1937
+ "step": 24800
1938
+ },
1939
+ {
1940
+ "epoch": 36.78,
1941
+ "learning_rate": 2.110596546310832e-05,
1942
+ "loss": 1.9806,
1943
+ "step": 24900
1944
+ },
1945
+ {
1946
+ "epoch": 36.93,
1947
+ "learning_rate": 2.087048665620094e-05,
1948
+ "loss": 2.0154,
1949
+ "step": 25000
1950
+ },
1951
+ {
1952
+ "epoch": 36.93,
1953
+ "eval_loss": 0.9483017325401306,
1954
+ "eval_runtime": 117.4294,
1955
+ "eval_samples_per_second": 17.21,
1956
+ "eval_steps_per_second": 2.154,
1957
+ "eval_wer": 2.078673923800099,
1958
+ "step": 25000
1959
+ },
1960
+ {
1961
+ "epoch": 37.08,
1962
+ "learning_rate": 2.063500784929356e-05,
1963
+ "loss": 1.997,
1964
+ "step": 25100
1965
+ },
1966
+ {
1967
+ "epoch": 37.22,
1968
+ "learning_rate": 2.0399529042386185e-05,
1969
+ "loss": 1.9712,
1970
+ "step": 25200
1971
+ },
1972
+ {
1973
+ "epoch": 37.37,
1974
+ "learning_rate": 2.0164050235478804e-05,
1975
+ "loss": 2.0131,
1976
+ "step": 25300
1977
+ },
1978
+ {
1979
+ "epoch": 37.52,
1980
+ "learning_rate": 1.992857142857143e-05,
1981
+ "loss": 1.9605,
1982
+ "step": 25400
1983
+ },
1984
+ {
1985
+ "epoch": 37.67,
1986
+ "learning_rate": 1.9695447409733123e-05,
1987
+ "loss": 1.9966,
1988
+ "step": 25500
1989
+ },
1990
+ {
1991
+ "epoch": 37.67,
1992
+ "eval_loss": 0.940608024597168,
1993
+ "eval_runtime": 115.2635,
1994
+ "eval_samples_per_second": 17.534,
1995
+ "eval_steps_per_second": 2.195,
1996
+ "eval_wer": 2.0296882731321126,
1997
+ "step": 25500
1998
+ },
1999
+ {
2000
+ "epoch": 37.81,
2001
+ "learning_rate": 1.945996860282574e-05,
2002
+ "loss": 1.9879,
2003
+ "step": 25600
2004
+ },
2005
+ {
2006
+ "epoch": 37.96,
2007
+ "learning_rate": 1.9224489795918367e-05,
2008
+ "loss": 1.9836,
2009
+ "step": 25700
2010
+ },
2011
+ {
2012
+ "epoch": 38.11,
2013
+ "learning_rate": 1.8989010989010988e-05,
2014
+ "loss": 1.9872,
2015
+ "step": 25800
2016
+ },
2017
+ {
2018
+ "epoch": 38.26,
2019
+ "learning_rate": 1.8753532182103607e-05,
2020
+ "loss": 1.9684,
2021
+ "step": 25900
2022
+ },
2023
+ {
2024
+ "epoch": 38.4,
2025
+ "learning_rate": 1.851805337519623e-05,
2026
+ "loss": 1.9753,
2027
+ "step": 26000
2028
+ },
2029
+ {
2030
+ "epoch": 38.4,
2031
+ "eval_loss": 0.9418594837188721,
2032
+ "eval_runtime": 115.7124,
2033
+ "eval_samples_per_second": 17.466,
2034
+ "eval_steps_per_second": 2.186,
2035
+ "eval_wer": 2.0346363186541314,
2036
+ "step": 26000
2037
+ },
2038
+ {
2039
+ "epoch": 38.55,
2040
+ "learning_rate": 1.828257456828885e-05,
2041
+ "loss": 1.9926,
2042
+ "step": 26100
2043
+ },
2044
+ {
2045
+ "epoch": 38.7,
2046
+ "learning_rate": 1.8047095761381475e-05,
2047
+ "loss": 1.9685,
2048
+ "step": 26200
2049
+ },
2050
+ {
2051
+ "epoch": 38.85,
2052
+ "learning_rate": 1.7811616954474097e-05,
2053
+ "loss": 1.9707,
2054
+ "step": 26300
2055
+ },
2056
+ {
2057
+ "epoch": 39.0,
2058
+ "learning_rate": 1.7576138147566716e-05,
2059
+ "loss": 1.9477,
2060
+ "step": 26400
2061
+ },
2062
+ {
2063
+ "epoch": 39.14,
2064
+ "learning_rate": 1.7340659340659337e-05,
2065
+ "loss": 1.9524,
2066
+ "step": 26500
2067
+ },
2068
+ {
2069
+ "epoch": 39.14,
2070
+ "eval_loss": 0.927354097366333,
2071
+ "eval_runtime": 115.8614,
2072
+ "eval_samples_per_second": 17.443,
2073
+ "eval_steps_per_second": 2.184,
2074
+ "eval_wer": 2.0697674418604652,
2075
+ "step": 26500
2076
+ },
2077
+ {
2078
+ "epoch": 39.29,
2079
+ "learning_rate": 1.7105180533751963e-05,
2080
+ "loss": 1.9673,
2081
+ "step": 26600
2082
+ },
2083
+ {
2084
+ "epoch": 39.44,
2085
+ "learning_rate": 1.6869701726844584e-05,
2086
+ "loss": 1.9802,
2087
+ "step": 26700
2088
+ },
2089
+ {
2090
+ "epoch": 39.59,
2091
+ "learning_rate": 1.6634222919937203e-05,
2092
+ "loss": 1.9408,
2093
+ "step": 26800
2094
+ },
2095
+ {
2096
+ "epoch": 39.73,
2097
+ "learning_rate": 1.6398744113029824e-05,
2098
+ "loss": 1.9482,
2099
+ "step": 26900
2100
+ },
2101
+ {
2102
+ "epoch": 39.88,
2103
+ "learning_rate": 1.6163265306122446e-05,
2104
+ "loss": 1.9427,
2105
+ "step": 27000
2106
+ },
2107
+ {
2108
+ "epoch": 39.88,
2109
+ "eval_loss": 0.9232719540596008,
2110
+ "eval_runtime": 116.3191,
2111
+ "eval_samples_per_second": 17.375,
2112
+ "eval_steps_per_second": 2.175,
2113
+ "eval_wer": 2.078673923800099,
2114
+ "step": 27000
2115
+ },
2116
+ {
2117
+ "epoch": 40.03,
2118
+ "learning_rate": 1.592778649921507e-05,
2119
+ "loss": 1.9653,
2120
+ "step": 27100
2121
+ },
2122
+ {
2123
+ "epoch": 40.18,
2124
+ "learning_rate": 1.569230769230769e-05,
2125
+ "loss": 1.9157,
2126
+ "step": 27200
2127
+ },
2128
+ {
2129
+ "epoch": 40.32,
2130
+ "learning_rate": 1.545682888540031e-05,
2131
+ "loss": 1.9493,
2132
+ "step": 27300
2133
+ },
2134
+ {
2135
+ "epoch": 40.47,
2136
+ "learning_rate": 1.5221350078492935e-05,
2137
+ "loss": 1.8974,
2138
+ "step": 27400
2139
+ },
2140
+ {
2141
+ "epoch": 40.62,
2142
+ "learning_rate": 1.4985871271585557e-05,
2143
+ "loss": 1.9258,
2144
+ "step": 27500
2145
+ },
2146
+ {
2147
+ "epoch": 40.62,
2148
+ "eval_loss": 0.9182448983192444,
2149
+ "eval_runtime": 115.4065,
2150
+ "eval_samples_per_second": 17.512,
2151
+ "eval_steps_per_second": 2.192,
2152
+ "eval_wer": 2.052944087085601,
2153
+ "step": 27500
2154
+ },
2155
+ {
2156
+ "epoch": 40.77,
2157
+ "learning_rate": 1.4750392464678177e-05,
2158
+ "loss": 1.9354,
2159
+ "step": 27600
2160
+ },
2161
+ {
2162
+ "epoch": 40.92,
2163
+ "learning_rate": 1.4514913657770799e-05,
2164
+ "loss": 1.952,
2165
+ "step": 27700
2166
+ },
2167
+ {
2168
+ "epoch": 41.06,
2169
+ "learning_rate": 1.4281789638932496e-05,
2170
+ "loss": 1.9231,
2171
+ "step": 27800
2172
+ },
2173
+ {
2174
+ "epoch": 41.21,
2175
+ "learning_rate": 1.4046310832025116e-05,
2176
+ "loss": 1.9465,
2177
+ "step": 27900
2178
+ },
2179
+ {
2180
+ "epoch": 41.36,
2181
+ "learning_rate": 1.3810832025117738e-05,
2182
+ "loss": 1.9031,
2183
+ "step": 28000
2184
+ },
2185
+ {
2186
+ "epoch": 41.36,
2187
+ "eval_loss": 0.9149593114852905,
2188
+ "eval_runtime": 116.2555,
2189
+ "eval_samples_per_second": 17.384,
2190
+ "eval_steps_per_second": 2.176,
2191
+ "eval_wer": 2.078673923800099,
2192
+ "step": 28000
2193
+ },
2194
+ {
2195
+ "epoch": 41.51,
2196
+ "learning_rate": 1.357535321821036e-05,
2197
+ "loss": 1.9361,
2198
+ "step": 28100
2199
+ },
2200
+ {
2201
+ "epoch": 41.65,
2202
+ "learning_rate": 1.3342229199372054e-05,
2203
+ "loss": 1.916,
2204
+ "step": 28200
2205
+ },
2206
+ {
2207
+ "epoch": 41.8,
2208
+ "learning_rate": 1.3106750392464677e-05,
2209
+ "loss": 1.9149,
2210
+ "step": 28300
2211
+ },
2212
+ {
2213
+ "epoch": 41.95,
2214
+ "learning_rate": 1.2871271585557299e-05,
2215
+ "loss": 1.9037,
2216
+ "step": 28400
2217
+ },
2218
+ {
2219
+ "epoch": 42.1,
2220
+ "learning_rate": 1.263579277864992e-05,
2221
+ "loss": 1.9297,
2222
+ "step": 28500
2223
+ },
2224
+ {
2225
+ "epoch": 42.1,
2226
+ "eval_loss": 0.9040070176124573,
2227
+ "eval_runtime": 113.8901,
2228
+ "eval_samples_per_second": 17.745,
2229
+ "eval_steps_per_second": 2.221,
2230
+ "eval_wer": 2.0504700643245917,
2231
+ "step": 28500
2232
+ },
2233
+ {
2234
+ "epoch": 42.25,
2235
+ "learning_rate": 1.2400313971742541e-05,
2236
+ "loss": 1.8855,
2237
+ "step": 28600
2238
+ },
2239
+ {
2240
+ "epoch": 42.39,
2241
+ "learning_rate": 1.2164835164835163e-05,
2242
+ "loss": 1.9095,
2243
+ "step": 28700
2244
+ },
2245
+ {
2246
+ "epoch": 42.54,
2247
+ "learning_rate": 1.1929356357927786e-05,
2248
+ "loss": 1.8913,
2249
+ "step": 28800
2250
+ },
2251
+ {
2252
+ "epoch": 42.69,
2253
+ "learning_rate": 1.1693877551020408e-05,
2254
+ "loss": 1.8685,
2255
+ "step": 28900
2256
+ },
2257
+ {
2258
+ "epoch": 42.84,
2259
+ "learning_rate": 1.1458398744113028e-05,
2260
+ "loss": 1.9041,
2261
+ "step": 29000
2262
+ },
2263
+ {
2264
+ "epoch": 42.84,
2265
+ "eval_loss": 0.9008907675743103,
2266
+ "eval_runtime": 114.9643,
2267
+ "eval_samples_per_second": 17.579,
2268
+ "eval_steps_per_second": 2.201,
2269
+ "eval_wer": 2.05789213260762,
2270
+ "step": 29000
2271
+ },
2272
+ {
2273
+ "epoch": 42.98,
2274
+ "learning_rate": 1.122291993720565e-05,
2275
+ "loss": 1.8963,
2276
+ "step": 29100
2277
+ },
2278
+ {
2279
+ "epoch": 43.13,
2280
+ "learning_rate": 1.0987441130298273e-05,
2281
+ "loss": 1.9068,
2282
+ "step": 29200
2283
+ },
2284
+ {
2285
+ "epoch": 43.28,
2286
+ "learning_rate": 1.0751962323390895e-05,
2287
+ "loss": 1.9003,
2288
+ "step": 29300
2289
+ },
2290
+ {
2291
+ "epoch": 43.43,
2292
+ "learning_rate": 1.0516483516483515e-05,
2293
+ "loss": 1.891,
2294
+ "step": 29400
2295
+ },
2296
+ {
2297
+ "epoch": 43.57,
2298
+ "learning_rate": 1.0281004709576137e-05,
2299
+ "loss": 1.8929,
2300
+ "step": 29500
2301
+ },
2302
+ {
2303
+ "epoch": 43.57,
2304
+ "eval_loss": 0.8968304991722107,
2305
+ "eval_runtime": 116.4378,
2306
+ "eval_samples_per_second": 17.357,
2307
+ "eval_steps_per_second": 2.173,
2308
+ "eval_wer": 2.032657100445324,
2309
+ "step": 29500
2310
+ },
2311
+ {
2312
+ "epoch": 43.72,
2313
+ "learning_rate": 1.0045525902668759e-05,
2314
+ "loss": 1.8827,
2315
+ "step": 29600
2316
+ },
2317
+ {
2318
+ "epoch": 43.87,
2319
+ "learning_rate": 9.810047095761382e-06,
2320
+ "loss": 1.8862,
2321
+ "step": 29700
2322
+ },
2323
+ {
2324
+ "epoch": 44.02,
2325
+ "learning_rate": 9.574568288854002e-06,
2326
+ "loss": 1.8787,
2327
+ "step": 29800
2328
+ },
2329
+ {
2330
+ "epoch": 44.17,
2331
+ "learning_rate": 9.339089481946624e-06,
2332
+ "loss": 1.8501,
2333
+ "step": 29900
2334
+ },
2335
+ {
2336
+ "epoch": 44.31,
2337
+ "learning_rate": 9.103610675039246e-06,
2338
+ "loss": 1.9077,
2339
+ "step": 30000
2340
+ },
2341
+ {
2342
+ "epoch": 44.31,
2343
+ "eval_loss": 0.8953686952590942,
2344
+ "eval_runtime": 115.4838,
2345
+ "eval_samples_per_second": 17.5,
2346
+ "eval_steps_per_second": 2.191,
2347
+ "eval_wer": 2.061850569025235,
2348
+ "step": 30000
2349
+ },
2350
+ {
2351
+ "epoch": 44.46,
2352
+ "learning_rate": 8.868131868131868e-06,
2353
+ "loss": 1.8804,
2354
+ "step": 30100
2355
+ },
2356
+ {
2357
+ "epoch": 44.61,
2358
+ "learning_rate": 8.63265306122449e-06,
2359
+ "loss": 1.8723,
2360
+ "step": 30200
2361
+ },
2362
+ {
2363
+ "epoch": 44.76,
2364
+ "learning_rate": 8.397174254317111e-06,
2365
+ "loss": 1.8577,
2366
+ "step": 30300
2367
+ },
2368
+ {
2369
+ "epoch": 44.9,
2370
+ "learning_rate": 8.161695447409733e-06,
2371
+ "loss": 1.8811,
2372
+ "step": 30400
2373
+ },
2374
+ {
2375
+ "epoch": 45.05,
2376
+ "learning_rate": 7.928571428571429e-06,
2377
+ "loss": 1.8504,
2378
+ "step": 30500
2379
+ },
2380
+ {
2381
+ "epoch": 45.05,
2382
+ "eval_loss": 0.892192542552948,
2383
+ "eval_runtime": 116.2513,
2384
+ "eval_samples_per_second": 17.385,
2385
+ "eval_steps_per_second": 2.176,
2386
+ "eval_wer": 2.07372587827808,
2387
+ "step": 30500
2388
+ },
2389
+ {
2390
+ "epoch": 45.2,
2391
+ "learning_rate": 7.693092621664049e-06,
2392
+ "loss": 1.861,
2393
+ "step": 30600
2394
+ },
2395
+ {
2396
+ "epoch": 45.35,
2397
+ "learning_rate": 7.457613814756671e-06,
2398
+ "loss": 1.8496,
2399
+ "step": 30700
2400
+ },
2401
+ {
2402
+ "epoch": 45.49,
2403
+ "learning_rate": 7.222135007849293e-06,
2404
+ "loss": 1.8612,
2405
+ "step": 30800
2406
+ },
2407
+ {
2408
+ "epoch": 45.64,
2409
+ "learning_rate": 6.986656200941915e-06,
2410
+ "loss": 1.865,
2411
+ "step": 30900
2412
+ },
2413
+ {
2414
+ "epoch": 45.79,
2415
+ "learning_rate": 6.751177394034536e-06,
2416
+ "loss": 1.8732,
2417
+ "step": 31000
2418
+ },
2419
+ {
2420
+ "epoch": 45.79,
2421
+ "eval_loss": 0.8897548317909241,
2422
+ "eval_runtime": 116.5927,
2423
+ "eval_samples_per_second": 17.334,
2424
+ "eval_steps_per_second": 2.17,
2425
+ "eval_wer": 2.0682830282038593,
2426
+ "step": 31000
2427
+ },
2428
+ {
2429
+ "epoch": 45.94,
2430
+ "learning_rate": 6.5156985871271585e-06,
2431
+ "loss": 1.8374,
2432
+ "step": 31100
2433
+ },
2434
+ {
2435
+ "epoch": 46.09,
2436
+ "learning_rate": 6.280219780219779e-06,
2437
+ "loss": 1.8395,
2438
+ "step": 31200
2439
+ },
2440
+ {
2441
+ "epoch": 46.23,
2442
+ "learning_rate": 6.044740973312402e-06,
2443
+ "loss": 1.8377,
2444
+ "step": 31300
2445
+ },
2446
+ {
2447
+ "epoch": 46.38,
2448
+ "learning_rate": 5.809262166405023e-06,
2449
+ "loss": 1.87,
2450
+ "step": 31400
2451
+ },
2452
+ {
2453
+ "epoch": 46.53,
2454
+ "learning_rate": 5.573783359497644e-06,
2455
+ "loss": 1.877,
2456
+ "step": 31500
2457
+ },
2458
+ {
2459
+ "epoch": 46.53,
2460
+ "eval_loss": 0.8848925828933716,
2461
+ "eval_runtime": 116.1465,
2462
+ "eval_samples_per_second": 17.4,
2463
+ "eval_steps_per_second": 2.178,
2464
+ "eval_wer": 2.0588817417120238,
2465
+ "step": 31500
2466
+ },
2467
+ {
2468
+ "epoch": 46.68,
2469
+ "learning_rate": 5.3383045525902665e-06,
2470
+ "loss": 1.8256,
2471
+ "step": 31600
2472
+ },
2473
+ {
2474
+ "epoch": 46.82,
2475
+ "learning_rate": 5.1028257456828875e-06,
2476
+ "loss": 1.8317,
2477
+ "step": 31700
2478
+ },
2479
+ {
2480
+ "epoch": 46.97,
2481
+ "learning_rate": 4.86734693877551e-06,
2482
+ "loss": 1.8579,
2483
+ "step": 31800
2484
+ },
2485
+ {
2486
+ "epoch": 47.12,
2487
+ "learning_rate": 4.631868131868132e-06,
2488
+ "loss": 1.839,
2489
+ "step": 31900
2490
+ },
2491
+ {
2492
+ "epoch": 47.27,
2493
+ "learning_rate": 4.396389324960754e-06,
2494
+ "loss": 1.8587,
2495
+ "step": 32000
2496
+ },
2497
+ {
2498
+ "epoch": 47.27,
2499
+ "eval_loss": 0.8843359351158142,
2500
+ "eval_runtime": 116.5866,
2501
+ "eval_samples_per_second": 17.335,
2502
+ "eval_steps_per_second": 2.17,
2503
+ "eval_wer": 2.045027214250371,
2504
+ "step": 32000
2505
+ },
2506
+ {
2507
+ "epoch": 47.41,
2508
+ "learning_rate": 4.160910518053375e-06,
2509
+ "loss": 1.8419,
2510
+ "step": 32100
2511
+ },
2512
+ {
2513
+ "epoch": 47.56,
2514
+ "learning_rate": 3.925431711145996e-06,
2515
+ "loss": 1.8639,
2516
+ "step": 32200
2517
+ },
2518
+ {
2519
+ "epoch": 47.71,
2520
+ "learning_rate": 3.6899529042386186e-06,
2521
+ "loss": 1.8395,
2522
+ "step": 32300
2523
+ },
2524
+ {
2525
+ "epoch": 47.86,
2526
+ "learning_rate": 3.45447409733124e-06,
2527
+ "loss": 1.8369,
2528
+ "step": 32400
2529
+ },
2530
+ {
2531
+ "epoch": 48.01,
2532
+ "learning_rate": 3.2189952904238617e-06,
2533
+ "loss": 1.8236,
2534
+ "step": 32500
2535
+ },
2536
+ {
2537
+ "epoch": 48.01,
2538
+ "eval_loss": 0.8810222148895264,
2539
+ "eval_runtime": 115.817,
2540
+ "eval_samples_per_second": 17.45,
2541
+ "eval_steps_per_second": 2.184,
2542
+ "eval_wer": 2.0554181098466104,
2543
+ "step": 32500
2544
+ }
2545
+ ],
2546
+ "max_steps": 33850,
2547
+ "num_train_epochs": 50,
2548
+ "total_flos": 1.478948944631317e+20,
2549
+ "trial_name": null,
2550
+ "trial_params": null
2551
+ }
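The trainer_state.json files in this commit are standard Hugging Face Trainer logs. A minimal sketch (path taken from this commit's checkpoint layout; nothing beyond the standard library is needed) for pulling the periodic evaluation entries out of `log_history`, e.g. to track how eval_loss and eval_wer move across steps:

```python
import json

# Minimal sketch: read the trainer_state.json added above (path from this commit's layout).
with open("checkpoint-32500/trainer_state.json") as f:
    state = json.load(f)

# The periodic evaluation entries are the ones carrying "eval_loss" / "eval_wer".
evals = [entry for entry in state["log_history"] if "eval_loss" in entry]
for entry in evals:
    print(f'step {entry["step"]:>6}  eval_loss {entry["eval_loss"]:.4f}  eval_wer {entry["eval_wer"]:.4f}')
```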
checkpoint-32500/training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:865e00e6503ba9a84aae151f6fca7048aa8118080935253305e52fcf3ebbf980
3
+ size 3055
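The binary files in this commit (training_args.bin, optimizer.pt, pytorch_model.bin, and so on) are stored as git-lfs pointer files: three lines giving the spec version, the sha256 oid, and the size in bytes. A small sketch for reading one of them, assuming the pointer text (not the resolved binary) is what currently sits on disk, i.e. the LFS object has not been pulled yet:

```python
# Sketch: parse one of the git-lfs pointer files added in this commit. This assumes the
# pointer text (version / oid / size lines) is on disk; with git-lfs installed and the
# object pulled, the real binary would be there instead.
def parse_lfs_pointer(path: str) -> dict:
    fields = {}
    with open(path) as f:
        for line in f:
            key, _, value = line.strip().partition(" ")
            fields[key] = value
    return fields

ptr = parse_lfs_pointer("checkpoint-32500/training_args.bin")
print(ptr["version"])  # https://git-lfs.github.com/spec/v1
print(ptr["oid"])      # sha256:865e00e6...
print(ptr["size"])     # 3055
```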
checkpoint-33000/config.json ADDED
@@ -0,0 +1,107 @@
1
+ {
2
+ "_name_or_path": "facebook/wav2vec2-xls-r-300m",
3
+ "activation_dropout": 0.1,
4
+ "adapter_kernel_size": 3,
5
+ "adapter_stride": 2,
6
+ "add_adapter": false,
7
+ "apply_spec_augment": true,
8
+ "architectures": [
9
+ "Wav2Vec2ForCTC"
10
+ ],
11
+ "attention_dropout": 0.0,
12
+ "bos_token_id": 1,
13
+ "classifier_proj_size": 256,
14
+ "codevector_dim": 768,
15
+ "contrastive_logits_temperature": 0.1,
16
+ "conv_bias": true,
17
+ "conv_dim": [
18
+ 512,
19
+ 512,
20
+ 512,
21
+ 512,
22
+ 512,
23
+ 512,
24
+ 512
25
+ ],
26
+ "conv_kernel": [
27
+ 10,
28
+ 3,
29
+ 3,
30
+ 3,
31
+ 3,
32
+ 2,
33
+ 2
34
+ ],
35
+ "conv_stride": [
36
+ 5,
37
+ 2,
38
+ 2,
39
+ 2,
40
+ 2,
41
+ 2,
42
+ 2
43
+ ],
44
+ "ctc_loss_reduction": "mean",
45
+ "ctc_zero_infinity": false,
46
+ "diversity_loss_weight": 0.1,
47
+ "do_stable_layer_norm": true,
48
+ "eos_token_id": 2,
49
+ "feat_extract_activation": "gelu",
50
+ "feat_extract_dropout": 0.0,
51
+ "feat_extract_norm": "layer",
52
+ "feat_proj_dropout": 0.0,
53
+ "feat_quantizer_dropout": 0.0,
54
+ "final_dropout": 0.0,
55
+ "hidden_act": "gelu",
56
+ "hidden_dropout": 0.0,
57
+ "hidden_size": 1024,
58
+ "initializer_range": 0.02,
59
+ "intermediate_size": 4096,
60
+ "layer_norm_eps": 1e-05,
61
+ "layerdrop": 0.0,
62
+ "mask_feature_length": 64,
63
+ "mask_feature_min_masks": 0,
64
+ "mask_feature_prob": 0.25,
65
+ "mask_time_length": 10,
66
+ "mask_time_min_masks": 2,
67
+ "mask_time_prob": 0.75,
68
+ "model_type": "wav2vec2",
69
+ "num_adapter_layers": 3,
70
+ "num_attention_heads": 16,
71
+ "num_codevector_groups": 2,
72
+ "num_codevectors_per_group": 320,
73
+ "num_conv_pos_embedding_groups": 16,
74
+ "num_conv_pos_embeddings": 128,
75
+ "num_feat_extract_layers": 7,
76
+ "num_hidden_layers": 24,
77
+ "num_negatives": 100,
78
+ "output_hidden_size": 1024,
79
+ "pad_token_id": 4649,
80
+ "proj_codevector_dim": 768,
81
+ "tdnn_dilation": [
82
+ 1,
83
+ 2,
84
+ 3,
85
+ 1,
86
+ 1
87
+ ],
88
+ "tdnn_dim": [
89
+ 512,
90
+ 512,
91
+ 512,
92
+ 512,
93
+ 1500
94
+ ],
95
+ "tdnn_kernel": [
96
+ 5,
97
+ 3,
98
+ 3,
99
+ 1,
100
+ 1
101
+ ],
102
+ "torch_dtype": "float32",
103
+ "transformers_version": "4.17.0.dev0",
104
+ "use_weighted_layer_sum": false,
105
+ "vocab_size": 4652,
106
+ "xvector_output_dim": 512
107
+ }
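This config.json is a standard transformers Wav2Vec2 configuration with a CTC head and a 4,652-token vocabulary. As a hedged sketch, the checkpoint added in this commit could be restored with the usual from_pretrained API; pointing it at the local "checkpoint-33000" directory is an assumption about usage, not something stated in the commit itself:

```python
from transformers import Wav2Vec2Config, Wav2Vec2ForCTC

# Sketch: rebuild the architecture from the committed config.json and load the checkpoint
# weights. The local path "checkpoint-33000" is an assumption based on this commit's layout.
config = Wav2Vec2Config.from_pretrained("checkpoint-33000")
model = Wav2Vec2ForCTC.from_pretrained("checkpoint-33000", config=config)

print(model.config.vocab_size)   # 4652, matching the config above
print(model.config.hidden_size)  # 1024
```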
checkpoint-33000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:38a8a5fdff63d4a4378f4590be6c24c0b9a0f8496fb6cf89371362c9653f6984
3
+ size 2528205329
checkpoint-33000/preprocessor_config.json ADDED
@@ -0,0 +1,9 @@
1
+ {
2
+ "do_normalize": true,
3
+ "feature_extractor_type": "Wav2Vec2FeatureExtractor",
4
+ "feature_size": 1,
5
+ "padding_side": "right",
6
+ "padding_value": 0,
7
+ "return_attention_mask": true,
8
+ "sampling_rate": 16000
9
+ }
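The preprocessor config above pins a Wav2Vec2FeatureExtractor to 16 kHz mono input with normalization and attention masks. A minimal usage sketch, with a random waveform standing in for real 16 kHz speech (illustration only; the checkpoint path is again assumed from this commit's layout):

```python
import numpy as np
from transformers import Wav2Vec2FeatureExtractor

# Sketch: load the committed feature-extractor settings and prepare one utterance.
extractor = Wav2Vec2FeatureExtractor.from_pretrained("checkpoint-33000")
waveform = np.random.randn(16000).astype("float32")  # one second at sampling_rate=16000

inputs = extractor(waveform, sampling_rate=16000, return_tensors="pt")
print(inputs["input_values"].shape)    # torch.Size([1, 16000])
print(inputs["attention_mask"].shape)  # present because return_attention_mask is true
```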
checkpoint-33000/pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:0f6e43ed7bda7dc19435f9bdb6091a5bdc5822ed485d431eb6801940f6528e40
3
+ size 1280996913
checkpoint-33000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:79d02bfd17cc7b1be60e36af047bebbbca3283c21088c5ce57b9d4ad23318c14
3
+ size 14567
checkpoint-33000/scaler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:178406c3c23cad01fd09a6cc52e098b225f027bded38f500b369d3d2fd9c5aa6
3
+ size 559
checkpoint-33000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:daa000cad23b84da64cbc3d157f4eaa6f9a33d6eb1ccc5a59e2749c28a6c719b
3
+ size 623
checkpoint-33000/trainer_state.json ADDED
@@ -0,0 +1,2590 @@
1
+ {
2
+ "best_metric": null,
3
+ "best_model_checkpoint": null,
4
+ "epoch": 48.74418604651163,
5
+ "global_step": 33000,
6
+ "is_hyper_param_search": false,
7
+ "is_local_process_zero": true,
8
+ "is_world_process_zero": true,
9
+ "log_history": [
10
+ {
11
+ "epoch": 0.15,
12
+ "learning_rate": 3.6375e-06,
13
+ "loss": 124.9665,
14
+ "step": 100
15
+ },
16
+ {
17
+ "epoch": 0.3,
18
+ "learning_rate": 7.3875e-06,
19
+ "loss": 92.673,
20
+ "step": 200
21
+ },
22
+ {
23
+ "epoch": 0.44,
24
+ "learning_rate": 1.1099999999999999e-05,
25
+ "loss": 74.8932,
26
+ "step": 300
27
+ },
28
+ {
29
+ "epoch": 0.59,
30
+ "learning_rate": 1.485e-05,
31
+ "loss": 68.0432,
32
+ "step": 400
33
+ },
34
+ {
35
+ "epoch": 0.74,
36
+ "learning_rate": 1.8599999999999998e-05,
37
+ "loss": 60.2112,
38
+ "step": 500
39
+ },
40
+ {
41
+ "epoch": 0.74,
42
+ "eval_loss": 64.81886291503906,
43
+ "eval_runtime": 129.9516,
44
+ "eval_samples_per_second": 15.552,
45
+ "eval_steps_per_second": 1.947,
46
+ "eval_wer": 1.0,
47
+ "step": 500
48
+ },
49
+ {
50
+ "epoch": 0.89,
51
+ "learning_rate": 2.2349999999999998e-05,
52
+ "loss": 51.3096,
53
+ "step": 600
54
+ },
55
+ {
56
+ "epoch": 1.03,
57
+ "learning_rate": 2.6099999999999997e-05,
58
+ "loss": 39.1106,
59
+ "step": 700
60
+ },
61
+ {
62
+ "epoch": 1.18,
63
+ "learning_rate": 2.985e-05,
64
+ "loss": 26.6843,
65
+ "step": 800
66
+ },
67
+ {
68
+ "epoch": 1.33,
69
+ "learning_rate": 3.36e-05,
70
+ "loss": 14.7864,
71
+ "step": 900
72
+ },
73
+ {
74
+ "epoch": 1.48,
75
+ "learning_rate": 3.735e-05,
76
+ "loss": 8.1128,
77
+ "step": 1000
78
+ },
79
+ {
80
+ "epoch": 1.48,
81
+ "eval_loss": 6.899676322937012,
82
+ "eval_runtime": 115.5788,
83
+ "eval_samples_per_second": 17.486,
84
+ "eval_steps_per_second": 2.189,
85
+ "eval_wer": 1.0,
86
+ "step": 1000
87
+ },
88
+ {
89
+ "epoch": 1.62,
90
+ "learning_rate": 4.11e-05,
91
+ "loss": 6.6068,
92
+ "step": 1100
93
+ },
94
+ {
95
+ "epoch": 1.77,
96
+ "learning_rate": 4.484999999999999e-05,
97
+ "loss": 6.23,
98
+ "step": 1200
99
+ },
100
+ {
101
+ "epoch": 1.92,
102
+ "learning_rate": 4.8599999999999995e-05,
103
+ "loss": 6.0972,
104
+ "step": 1300
105
+ },
106
+ {
107
+ "epoch": 2.07,
108
+ "learning_rate": 5.234999999999999e-05,
109
+ "loss": 6.0595,
110
+ "step": 1400
111
+ },
112
+ {
113
+ "epoch": 2.22,
114
+ "learning_rate": 5.6099999999999995e-05,
115
+ "loss": 6.0492,
116
+ "step": 1500
117
+ },
118
+ {
119
+ "epoch": 2.22,
120
+ "eval_loss": 5.967654228210449,
121
+ "eval_runtime": 115.432,
122
+ "eval_samples_per_second": 17.508,
123
+ "eval_steps_per_second": 2.192,
124
+ "eval_wer": 1.949529935675408,
125
+ "step": 1500
126
+ },
127
+ {
128
+ "epoch": 2.36,
129
+ "learning_rate": 5.985e-05,
130
+ "loss": 6.0266,
131
+ "step": 1600
132
+ },
133
+ {
134
+ "epoch": 2.51,
135
+ "learning_rate": 6.359999999999999e-05,
136
+ "loss": 5.9902,
137
+ "step": 1700
138
+ },
139
+ {
140
+ "epoch": 2.66,
141
+ "learning_rate": 6.735e-05,
142
+ "loss": 5.9762,
143
+ "step": 1800
144
+ },
145
+ {
146
+ "epoch": 2.81,
147
+ "learning_rate": 7.11e-05,
148
+ "loss": 5.9491,
149
+ "step": 1900
150
+ },
151
+ {
152
+ "epoch": 2.95,
153
+ "learning_rate": 7.484999999999999e-05,
154
+ "loss": 5.9326,
155
+ "step": 2000
156
+ },
157
+ {
158
+ "epoch": 2.95,
159
+ "eval_loss": 5.884542942047119,
160
+ "eval_runtime": 114.597,
161
+ "eval_samples_per_second": 17.636,
162
+ "eval_steps_per_second": 2.208,
163
+ "eval_wer": 1.409203364670955,
164
+ "step": 2000
165
+ },
166
+ {
167
+ "epoch": 3.1,
168
+ "learning_rate": 7.477394034536891e-05,
169
+ "loss": 5.9356,
170
+ "step": 2100
171
+ },
172
+ {
173
+ "epoch": 3.25,
174
+ "learning_rate": 7.453846153846153e-05,
175
+ "loss": 5.8889,
176
+ "step": 2200
177
+ },
178
+ {
179
+ "epoch": 3.4,
180
+ "learning_rate": 7.430298273155415e-05,
181
+ "loss": 5.899,
182
+ "step": 2300
183
+ },
184
+ {
185
+ "epoch": 3.54,
186
+ "learning_rate": 7.406750392464678e-05,
187
+ "loss": 5.8824,
188
+ "step": 2400
189
+ },
190
+ {
191
+ "epoch": 3.69,
192
+ "learning_rate": 7.38320251177394e-05,
193
+ "loss": 5.8763,
194
+ "step": 2500
195
+ },
196
+ {
197
+ "epoch": 3.69,
198
+ "eval_loss": 5.846009731292725,
199
+ "eval_runtime": 117.5393,
200
+ "eval_samples_per_second": 17.194,
201
+ "eval_steps_per_second": 2.152,
202
+ "eval_wer": 1.6125680356259278,
203
+ "step": 2500
204
+ },
205
+ {
206
+ "epoch": 3.84,
207
+ "learning_rate": 7.359654631083201e-05,
208
+ "loss": 5.875,
209
+ "step": 2600
210
+ },
211
+ {
212
+ "epoch": 3.99,
213
+ "learning_rate": 7.336106750392464e-05,
214
+ "loss": 5.8671,
215
+ "step": 2700
216
+ },
217
+ {
218
+ "epoch": 4.14,
219
+ "learning_rate": 7.312558869701726e-05,
220
+ "loss": 5.8591,
221
+ "step": 2800
222
+ },
223
+ {
224
+ "epoch": 4.28,
225
+ "learning_rate": 7.289010989010989e-05,
226
+ "loss": 5.8226,
227
+ "step": 2900
228
+ },
229
+ {
230
+ "epoch": 4.43,
231
+ "learning_rate": 7.265463108320251e-05,
232
+ "loss": 5.7888,
233
+ "step": 3000
234
+ },
235
+ {
236
+ "epoch": 4.43,
237
+ "eval_loss": 5.75445032119751,
238
+ "eval_runtime": 114.1832,
239
+ "eval_samples_per_second": 17.7,
240
+ "eval_steps_per_second": 2.216,
241
+ "eval_wer": 2.2033646709549726,
242
+ "step": 3000
243
+ },
244
+ {
245
+ "epoch": 4.58,
246
+ "learning_rate": 7.241915227629513e-05,
247
+ "loss": 5.8041,
248
+ "step": 3100
249
+ },
250
+ {
251
+ "epoch": 4.73,
252
+ "learning_rate": 7.218367346938774e-05,
253
+ "loss": 5.8013,
254
+ "step": 3200
255
+ },
256
+ {
257
+ "epoch": 4.87,
258
+ "learning_rate": 7.194819466248037e-05,
259
+ "loss": 5.7947,
260
+ "step": 3300
261
+ },
262
+ {
263
+ "epoch": 5.02,
264
+ "learning_rate": 7.171271585557299e-05,
265
+ "loss": 5.7802,
266
+ "step": 3400
267
+ },
268
+ {
269
+ "epoch": 5.17,
270
+ "learning_rate": 7.147723704866562e-05,
271
+ "loss": 5.735,
272
+ "step": 3500
273
+ },
274
+ {
275
+ "epoch": 5.17,
276
+ "eval_loss": 5.677657604217529,
277
+ "eval_runtime": 115.6516,
278
+ "eval_samples_per_second": 17.475,
279
+ "eval_steps_per_second": 2.188,
280
+ "eval_wer": 2.334982681840673,
281
+ "step": 3500
282
+ },
283
+ {
284
+ "epoch": 5.32,
285
+ "learning_rate": 7.124175824175823e-05,
286
+ "loss": 5.7198,
287
+ "step": 3600
288
+ },
289
+ {
290
+ "epoch": 5.47,
291
+ "learning_rate": 7.100627943485086e-05,
292
+ "loss": 5.7092,
293
+ "step": 3700
294
+ },
295
+ {
296
+ "epoch": 5.61,
297
+ "learning_rate": 7.077080062794347e-05,
298
+ "loss": 5.6613,
299
+ "step": 3800
300
+ },
301
+ {
302
+ "epoch": 5.76,
303
+ "learning_rate": 7.05353218210361e-05,
304
+ "loss": 5.6579,
305
+ "step": 3900
306
+ },
307
+ {
308
+ "epoch": 5.91,
309
+ "learning_rate": 7.029984301412873e-05,
310
+ "loss": 5.6861,
311
+ "step": 4000
312
+ },
313
+ {
314
+ "epoch": 5.91,
315
+ "eval_loss": 5.517865180969238,
316
+ "eval_runtime": 115.3653,
317
+ "eval_samples_per_second": 17.518,
318
+ "eval_steps_per_second": 2.193,
319
+ "eval_wer": 2.223156853043048,
320
+ "step": 4000
321
+ },
322
+ {
323
+ "epoch": 6.06,
324
+ "learning_rate": 7.006436420722135e-05,
325
+ "loss": 5.6024,
326
+ "step": 4100
327
+ },
328
+ {
329
+ "epoch": 6.2,
330
+ "learning_rate": 6.982888540031396e-05,
331
+ "loss": 5.5497,
332
+ "step": 4200
333
+ },
334
+ {
335
+ "epoch": 6.35,
336
+ "learning_rate": 6.959340659340659e-05,
337
+ "loss": 5.5257,
338
+ "step": 4300
339
+ },
340
+ {
341
+ "epoch": 6.5,
342
+ "learning_rate": 6.93579277864992e-05,
343
+ "loss": 5.4534,
344
+ "step": 4400
345
+ },
346
+ {
347
+ "epoch": 6.65,
348
+ "learning_rate": 6.912244897959182e-05,
349
+ "loss": 5.381,
350
+ "step": 4500
351
+ },
352
+ {
353
+ "epoch": 6.65,
354
+ "eval_loss": 5.142032146453857,
355
+ "eval_runtime": 117.6237,
356
+ "eval_samples_per_second": 17.182,
357
+ "eval_steps_per_second": 2.151,
358
+ "eval_wer": 2.18159327065809,
359
+ "step": 4500
360
+ },
361
+ {
362
+ "epoch": 6.79,
363
+ "learning_rate": 6.888697017268445e-05,
364
+ "loss": 5.3409,
365
+ "step": 4600
366
+ },
367
+ {
368
+ "epoch": 6.94,
369
+ "learning_rate": 6.865149136577708e-05,
370
+ "loss": 5.1283,
371
+ "step": 4700
372
+ },
373
+ {
374
+ "epoch": 7.09,
375
+ "learning_rate": 6.841601255886969e-05,
376
+ "loss": 4.8788,
377
+ "step": 4800
378
+ },
379
+ {
380
+ "epoch": 7.24,
381
+ "learning_rate": 6.818053375196232e-05,
382
+ "loss": 4.7235,
383
+ "step": 4900
384
+ },
385
+ {
386
+ "epoch": 7.39,
387
+ "learning_rate": 6.794505494505494e-05,
388
+ "loss": 4.625,
389
+ "step": 5000
390
+ },
391
+ {
392
+ "epoch": 7.39,
393
+ "eval_loss": 3.9019837379455566,
394
+ "eval_runtime": 116.0971,
395
+ "eval_samples_per_second": 17.408,
396
+ "eval_steps_per_second": 2.179,
397
+ "eval_wer": 2.0722414646214746,
398
+ "step": 5000
399
+ },
400
+ {
401
+ "epoch": 7.53,
402
+ "learning_rate": 6.770957613814756e-05,
403
+ "loss": 4.5404,
404
+ "step": 5100
405
+ },
406
+ {
407
+ "epoch": 7.68,
408
+ "learning_rate": 6.747409733124018e-05,
409
+ "loss": 4.4307,
410
+ "step": 5200
411
+ },
412
+ {
413
+ "epoch": 7.83,
414
+ "learning_rate": 6.723861852433281e-05,
415
+ "loss": 4.3794,
416
+ "step": 5300
417
+ },
418
+ {
419
+ "epoch": 7.98,
420
+ "learning_rate": 6.700313971742542e-05,
421
+ "loss": 4.2786,
422
+ "step": 5400
423
+ },
424
+ {
425
+ "epoch": 8.12,
426
+ "learning_rate": 6.676766091051805e-05,
427
+ "loss": 4.214,
428
+ "step": 5500
429
+ },
430
+ {
431
+ "epoch": 8.12,
432
+ "eval_loss": 3.339416027069092,
433
+ "eval_runtime": 116.9868,
434
+ "eval_samples_per_second": 17.275,
435
+ "eval_steps_per_second": 2.163,
436
+ "eval_wer": 2.1429985155863434,
437
+ "step": 5500
438
+ },
439
+ {
440
+ "epoch": 8.27,
441
+ "learning_rate": 6.653218210361068e-05,
442
+ "loss": 4.1206,
443
+ "step": 5600
444
+ },
445
+ {
446
+ "epoch": 8.42,
447
+ "learning_rate": 6.62967032967033e-05,
448
+ "loss": 4.081,
449
+ "step": 5700
450
+ },
451
+ {
452
+ "epoch": 8.57,
453
+ "learning_rate": 6.606122448979591e-05,
454
+ "loss": 4.0059,
455
+ "step": 5800
456
+ },
457
+ {
458
+ "epoch": 8.71,
459
+ "learning_rate": 6.582574568288854e-05,
460
+ "loss": 3.9251,
461
+ "step": 5900
462
+ },
463
+ {
464
+ "epoch": 8.86,
465
+ "learning_rate": 6.559262166405023e-05,
466
+ "loss": 3.8992,
467
+ "step": 6000
468
+ },
469
+ {
470
+ "epoch": 8.86,
471
+ "eval_loss": 2.9084665775299072,
472
+ "eval_runtime": 119.0907,
473
+ "eval_samples_per_second": 16.97,
474
+ "eval_steps_per_second": 2.124,
475
+ "eval_wer": 2.153389411182583,
476
+ "step": 6000
477
+ },
478
+ {
479
+ "epoch": 9.01,
480
+ "learning_rate": 6.535714285714285e-05,
481
+ "loss": 3.8494,
482
+ "step": 6100
483
+ },
484
+ {
485
+ "epoch": 9.16,
486
+ "learning_rate": 6.512166405023547e-05,
487
+ "loss": 3.7923,
488
+ "step": 6200
489
+ },
490
+ {
491
+ "epoch": 9.31,
492
+ "learning_rate": 6.48861852433281e-05,
493
+ "loss": 3.7416,
494
+ "step": 6300
495
+ },
496
+ {
497
+ "epoch": 9.45,
498
+ "learning_rate": 6.465070643642071e-05,
499
+ "loss": 3.7095,
500
+ "step": 6400
501
+ },
502
+ {
503
+ "epoch": 9.6,
504
+ "learning_rate": 6.441522762951334e-05,
505
+ "loss": 3.6481,
506
+ "step": 6500
507
+ },
508
+ {
509
+ "epoch": 9.6,
510
+ "eval_loss": 2.620758295059204,
511
+ "eval_runtime": 115.2407,
512
+ "eval_samples_per_second": 17.537,
513
+ "eval_steps_per_second": 2.195,
514
+ "eval_wer": 2.3537852548243445,
515
+ "step": 6500
516
+ },
517
+ {
518
+ "epoch": 9.75,
519
+ "learning_rate": 6.417974882260596e-05,
520
+ "loss": 3.6196,
521
+ "step": 6600
522
+ },
523
+ {
524
+ "epoch": 9.9,
525
+ "learning_rate": 6.394427001569859e-05,
526
+ "loss": 3.5941,
527
+ "step": 6700
528
+ },
529
+ {
530
+ "epoch": 10.04,
531
+ "learning_rate": 6.37087912087912e-05,
532
+ "loss": 3.5608,
533
+ "step": 6800
534
+ },
535
+ {
536
+ "epoch": 10.19,
537
+ "learning_rate": 6.347331240188383e-05,
538
+ "loss": 3.5296,
539
+ "step": 6900
540
+ },
541
+ {
542
+ "epoch": 10.34,
543
+ "learning_rate": 6.324018838304552e-05,
544
+ "loss": 3.4658,
545
+ "step": 7000
546
+ },
547
+ {
548
+ "epoch": 10.34,
549
+ "eval_loss": 2.3172152042388916,
550
+ "eval_runtime": 114.5436,
551
+ "eval_samples_per_second": 17.644,
552
+ "eval_steps_per_second": 2.209,
553
+ "eval_wer": 2.227115289460663,
554
+ "step": 7000
555
+ },
556
+ {
557
+ "epoch": 10.49,
558
+ "learning_rate": 6.300470957613814e-05,
559
+ "loss": 3.3977,
560
+ "step": 7100
561
+ },
562
+ {
563
+ "epoch": 10.63,
564
+ "learning_rate": 6.276923076923076e-05,
565
+ "loss": 3.3987,
566
+ "step": 7200
567
+ },
568
+ {
569
+ "epoch": 10.78,
570
+ "learning_rate": 6.253375196232339e-05,
571
+ "loss": 3.3587,
572
+ "step": 7300
573
+ },
574
+ {
575
+ "epoch": 10.93,
576
+ "learning_rate": 6.2298273155416e-05,
577
+ "loss": 3.2796,
578
+ "step": 7400
579
+ },
580
+ {
581
+ "epoch": 11.08,
582
+ "learning_rate": 6.206279434850863e-05,
583
+ "loss": 3.257,
584
+ "step": 7500
585
+ },
586
+ {
587
+ "epoch": 11.08,
588
+ "eval_loss": 2.0916049480438232,
589
+ "eval_runtime": 113.9408,
590
+ "eval_samples_per_second": 17.737,
591
+ "eval_steps_per_second": 2.22,
592
+ "eval_wer": 2.1350816427511132,
593
+ "step": 7500
594
+ },
595
+ {
596
+ "epoch": 11.23,
597
+ "learning_rate": 6.182731554160125e-05,
598
+ "loss": 3.2476,
599
+ "step": 7600
600
+ },
601
+ {
602
+ "epoch": 11.37,
603
+ "learning_rate": 6.159183673469388e-05,
604
+ "loss": 3.2463,
605
+ "step": 7700
606
+ },
607
+ {
608
+ "epoch": 11.52,
609
+ "learning_rate": 6.135635792778649e-05,
610
+ "loss": 3.2323,
611
+ "step": 7800
612
+ },
613
+ {
614
+ "epoch": 11.67,
615
+ "learning_rate": 6.112087912087912e-05,
616
+ "loss": 3.1674,
617
+ "step": 7900
618
+ },
619
+ {
620
+ "epoch": 11.82,
621
+ "learning_rate": 6.088540031397174e-05,
622
+ "loss": 3.1294,
623
+ "step": 8000
624
+ },
625
+ {
626
+ "epoch": 11.82,
627
+ "eval_loss": 1.895378828048706,
628
+ "eval_runtime": 115.1394,
629
+ "eval_samples_per_second": 17.553,
630
+ "eval_steps_per_second": 2.197,
631
+ "eval_wer": 2.2132607619990106,
632
+ "step": 8000
633
+ },
634
+ {
635
+ "epoch": 11.96,
636
+ "learning_rate": 6.0649921507064355e-05,
637
+ "loss": 3.1262,
638
+ "step": 8100
639
+ },
640
+ {
641
+ "epoch": 12.11,
642
+ "learning_rate": 6.041444270015698e-05,
643
+ "loss": 3.0377,
644
+ "step": 8200
645
+ },
646
+ {
647
+ "epoch": 12.26,
648
+ "learning_rate": 6.01789638932496e-05,
649
+ "loss": 3.0306,
650
+ "step": 8300
651
+ },
652
+ {
653
+ "epoch": 12.41,
654
+ "learning_rate": 5.994348508634223e-05,
655
+ "loss": 3.0425,
656
+ "step": 8400
657
+ },
658
+ {
659
+ "epoch": 12.56,
660
+ "learning_rate": 5.9710361067503915e-05,
661
+ "loss": 3.0266,
662
+ "step": 8500
663
+ },
664
+ {
665
+ "epoch": 12.56,
666
+ "eval_loss": 1.76727294921875,
667
+ "eval_runtime": 114.3494,
668
+ "eval_samples_per_second": 17.674,
669
+ "eval_steps_per_second": 2.213,
670
+ "eval_wer": 2.0895596239485403,
671
+ "step": 8500
672
+ },
673
+ {
674
+ "epoch": 12.7,
675
+ "learning_rate": 5.9474882260596537e-05,
676
+ "loss": 3.0398,
677
+ "step": 8600
678
+ },
679
+ {
680
+ "epoch": 12.85,
681
+ "learning_rate": 5.9239403453689165e-05,
682
+ "loss": 2.9985,
683
+ "step": 8700
684
+ },
685
+ {
686
+ "epoch": 13.0,
687
+ "learning_rate": 5.900392464678179e-05,
688
+ "loss": 2.9969,
689
+ "step": 8800
690
+ },
691
+ {
692
+ "epoch": 13.15,
693
+ "learning_rate": 5.876844583987441e-05,
694
+ "loss": 2.9648,
695
+ "step": 8900
696
+ },
697
+ {
698
+ "epoch": 13.29,
699
+ "learning_rate": 5.8532967032967024e-05,
700
+ "loss": 2.9451,
701
+ "step": 9000
702
+ },
703
+ {
704
+ "epoch": 13.29,
705
+ "eval_loss": 1.665855884552002,
706
+ "eval_runtime": 116.4877,
707
+ "eval_samples_per_second": 17.349,
708
+ "eval_steps_per_second": 2.172,
709
+ "eval_wer": 2.1380504700643246,
710
+ "step": 9000
711
+ },
712
+ {
713
+ "epoch": 13.44,
714
+ "learning_rate": 5.8297488226059645e-05,
715
+ "loss": 2.9573,
716
+ "step": 9100
717
+ },
718
+ {
719
+ "epoch": 13.59,
720
+ "learning_rate": 5.8062009419152274e-05,
721
+ "loss": 2.8819,
722
+ "step": 9200
723
+ },
724
+ {
725
+ "epoch": 13.74,
726
+ "learning_rate": 5.7826530612244896e-05,
727
+ "loss": 2.8901,
728
+ "step": 9300
729
+ },
730
+ {
731
+ "epoch": 13.88,
732
+ "learning_rate": 5.759105180533752e-05,
733
+ "loss": 2.8492,
734
+ "step": 9400
735
+ },
736
+ {
737
+ "epoch": 14.03,
738
+ "learning_rate": 5.735557299843013e-05,
739
+ "loss": 2.8802,
740
+ "step": 9500
741
+ },
742
+ {
743
+ "epoch": 14.03,
744
+ "eval_loss": 1.5637215375900269,
745
+ "eval_runtime": 115.1622,
746
+ "eval_samples_per_second": 17.549,
747
+ "eval_steps_per_second": 2.197,
748
+ "eval_wer": 2.1969322117763483,
749
+ "step": 9500
750
+ },
751
+ {
752
+ "epoch": 14.18,
753
+ "learning_rate": 5.7120094191522754e-05,
754
+ "loss": 2.8346,
755
+ "step": 9600
756
+ },
757
+ {
758
+ "epoch": 14.33,
759
+ "learning_rate": 5.6884615384615376e-05,
760
+ "loss": 2.8355,
761
+ "step": 9700
762
+ },
763
+ {
764
+ "epoch": 14.48,
765
+ "learning_rate": 5.6649136577708005e-05,
766
+ "loss": 2.8124,
767
+ "step": 9800
768
+ },
769
+ {
770
+ "epoch": 14.62,
771
+ "learning_rate": 5.6413657770800626e-05,
772
+ "loss": 2.7879,
773
+ "step": 9900
774
+ },
775
+ {
776
+ "epoch": 14.77,
777
+ "learning_rate": 5.617817896389324e-05,
778
+ "loss": 2.78,
779
+ "step": 10000
780
+ },
781
+ {
782
+ "epoch": 14.77,
783
+ "eval_loss": 1.4921427965164185,
784
+ "eval_runtime": 115.1,
785
+ "eval_samples_per_second": 17.559,
786
+ "eval_steps_per_second": 2.198,
787
+ "eval_wer": 2.2335477486392876,
788
+ "step": 10000
789
+ },
790
+ {
791
+ "epoch": 14.92,
792
+ "learning_rate": 5.594270015698586e-05,
793
+ "loss": 2.775,
794
+ "step": 10100
795
+ },
796
+ {
797
+ "epoch": 15.07,
798
+ "learning_rate": 5.5707221350078485e-05,
799
+ "loss": 2.7478,
800
+ "step": 10200
801
+ },
802
+ {
803
+ "epoch": 15.21,
804
+ "learning_rate": 5.5471742543171114e-05,
805
+ "loss": 2.7224,
806
+ "step": 10300
807
+ },
808
+ {
809
+ "epoch": 15.36,
810
+ "learning_rate": 5.5236263736263735e-05,
811
+ "loss": 2.7506,
812
+ "step": 10400
813
+ },
814
+ {
815
+ "epoch": 15.51,
816
+ "learning_rate": 5.500078492935635e-05,
817
+ "loss": 2.7049,
818
+ "step": 10500
819
+ },
820
+ {
821
+ "epoch": 15.51,
822
+ "eval_loss": 1.413183569908142,
823
+ "eval_runtime": 114.2743,
824
+ "eval_samples_per_second": 17.686,
825
+ "eval_steps_per_second": 2.214,
826
+ "eval_wer": 2.221672439386442,
827
+ "step": 10500
828
+ },
829
+ {
830
+ "epoch": 15.66,
831
+ "learning_rate": 5.476766091051805e-05,
832
+ "loss": 2.7145,
833
+ "step": 10600
834
+ },
835
+ {
836
+ "epoch": 15.8,
837
+ "learning_rate": 5.453218210361067e-05,
838
+ "loss": 2.6892,
839
+ "step": 10700
840
+ },
841
+ {
842
+ "epoch": 15.95,
843
+ "learning_rate": 5.4296703296703295e-05,
844
+ "loss": 2.69,
845
+ "step": 10800
846
+ },
847
+ {
848
+ "epoch": 16.1,
849
+ "learning_rate": 5.406122448979591e-05,
850
+ "loss": 2.623,
851
+ "step": 10900
852
+ },
853
+ {
854
+ "epoch": 16.25,
855
+ "learning_rate": 5.382574568288853e-05,
856
+ "loss": 2.6768,
857
+ "step": 11000
858
+ },
859
+ {
860
+ "epoch": 16.25,
861
+ "eval_loss": 1.3666878938674927,
862
+ "eval_runtime": 119.4402,
863
+ "eval_samples_per_second": 16.921,
864
+ "eval_steps_per_second": 2.118,
865
+ "eval_wer": 2.223156853043048,
866
+ "step": 11000
867
+ },
868
+ {
869
+ "epoch": 16.4,
870
+ "learning_rate": 5.359262166405023e-05,
871
+ "loss": 2.628,
872
+ "step": 11100
873
+ },
874
+ {
875
+ "epoch": 16.54,
876
+ "learning_rate": 5.3357142857142854e-05,
877
+ "loss": 2.6163,
878
+ "step": 11200
879
+ },
880
+ {
881
+ "epoch": 16.69,
882
+ "learning_rate": 5.312166405023547e-05,
883
+ "loss": 2.6193,
884
+ "step": 11300
885
+ },
886
+ {
887
+ "epoch": 16.84,
888
+ "learning_rate": 5.28861852433281e-05,
889
+ "loss": 2.6531,
890
+ "step": 11400
891
+ },
892
+ {
893
+ "epoch": 16.99,
894
+ "learning_rate": 5.265070643642072e-05,
895
+ "loss": 2.6358,
896
+ "step": 11500
897
+ },
898
+ {
899
+ "epoch": 16.99,
900
+ "eval_loss": 1.311090111732483,
901
+ "eval_runtime": 116.2157,
902
+ "eval_samples_per_second": 17.39,
903
+ "eval_steps_per_second": 2.177,
904
+ "eval_wer": 2.128649183572489,
905
+ "step": 11500
906
+ },
907
+ {
908
+ "epoch": 17.13,
909
+ "learning_rate": 5.241522762951334e-05,
910
+ "loss": 2.5748,
911
+ "step": 11600
912
+ },
913
+ {
914
+ "epoch": 17.28,
915
+ "learning_rate": 5.217974882260596e-05,
916
+ "loss": 2.6287,
917
+ "step": 11700
918
+ },
919
+ {
920
+ "epoch": 17.43,
921
+ "learning_rate": 5.194427001569858e-05,
922
+ "loss": 2.5583,
923
+ "step": 11800
924
+ },
925
+ {
926
+ "epoch": 17.58,
927
+ "learning_rate": 5.17087912087912e-05,
928
+ "loss": 2.5547,
929
+ "step": 11900
930
+ },
931
+ {
932
+ "epoch": 17.72,
933
+ "learning_rate": 5.147331240188383e-05,
934
+ "loss": 2.5802,
935
+ "step": 12000
936
+ },
937
+ {
938
+ "epoch": 17.72,
939
+ "eval_loss": 1.2678567171096802,
940
+ "eval_runtime": 116.076,
941
+ "eval_samples_per_second": 17.411,
942
+ "eval_steps_per_second": 2.18,
943
+ "eval_wer": 2.1429985155863434,
944
+ "step": 12000
945
+ },
946
+ {
947
+ "epoch": 17.87,
948
+ "learning_rate": 5.123783359497645e-05,
949
+ "loss": 2.557,
950
+ "step": 12100
951
+ },
952
+ {
953
+ "epoch": 18.02,
954
+ "learning_rate": 5.100235478806907e-05,
955
+ "loss": 2.5771,
956
+ "step": 12200
957
+ },
958
+ {
959
+ "epoch": 18.17,
960
+ "learning_rate": 5.076687598116169e-05,
961
+ "loss": 2.5393,
962
+ "step": 12300
963
+ },
964
+ {
965
+ "epoch": 18.32,
966
+ "learning_rate": 5.053375196232339e-05,
967
+ "loss": 2.5031,
968
+ "step": 12400
969
+ },
970
+ {
971
+ "epoch": 18.46,
972
+ "learning_rate": 5.029827315541601e-05,
973
+ "loss": 2.5012,
974
+ "step": 12500
975
+ },
976
+ {
977
+ "epoch": 18.46,
978
+ "eval_loss": 1.2365446090698242,
979
+ "eval_runtime": 116.0118,
980
+ "eval_samples_per_second": 17.421,
981
+ "eval_steps_per_second": 2.181,
982
+ "eval_wer": 2.115289460663038,
983
+ "step": 12500
984
+ },
985
+ {
986
+ "epoch": 18.61,
987
+ "learning_rate": 5.006279434850863e-05,
988
+ "loss": 2.54,
989
+ "step": 12600
990
+ },
991
+ {
992
+ "epoch": 18.76,
993
+ "learning_rate": 4.9827315541601246e-05,
994
+ "loss": 2.5072,
995
+ "step": 12700
996
+ },
997
+ {
998
+ "epoch": 18.91,
999
+ "learning_rate": 4.9591836734693875e-05,
1000
+ "loss": 2.4951,
1001
+ "step": 12800
1002
+ },
1003
+ {
1004
+ "epoch": 19.05,
1005
+ "learning_rate": 4.9356357927786497e-05,
1006
+ "loss": 2.4789,
1007
+ "step": 12900
1008
+ },
1009
+ {
1010
+ "epoch": 19.2,
1011
+ "learning_rate": 4.912087912087912e-05,
1012
+ "loss": 2.458,
1013
+ "step": 13000
1014
+ },
1015
+ {
1016
+ "epoch": 19.2,
1017
+ "eval_loss": 1.2117862701416016,
1018
+ "eval_runtime": 116.2579,
1019
+ "eval_samples_per_second": 17.384,
1020
+ "eval_steps_per_second": 2.176,
1021
+ "eval_wer": 2.1573478476001977,
1022
+ "step": 13000
1023
+ },
1024
+ {
1025
+ "epoch": 19.35,
1026
+ "learning_rate": 4.888540031397174e-05,
1027
+ "loss": 2.4616,
1028
+ "step": 13100
1029
+ },
1030
+ {
1031
+ "epoch": 19.5,
1032
+ "learning_rate": 4.8649921507064355e-05,
1033
+ "loss": 2.4739,
1034
+ "step": 13200
1035
+ },
1036
+ {
1037
+ "epoch": 19.65,
1038
+ "learning_rate": 4.8414442700156984e-05,
1039
+ "loss": 2.4867,
1040
+ "step": 13300
1041
+ },
1042
+ {
1043
+ "epoch": 19.79,
1044
+ "learning_rate": 4.8178963893249605e-05,
1045
+ "loss": 2.4568,
1046
+ "step": 13400
1047
+ },
1048
+ {
1049
+ "epoch": 19.94,
1050
+ "learning_rate": 4.794348508634223e-05,
1051
+ "loss": 2.4433,
1052
+ "step": 13500
1053
+ },
1054
+ {
1055
+ "epoch": 19.94,
1056
+ "eval_loss": 1.1991767883300781,
1057
+ "eval_runtime": 114.5641,
1058
+ "eval_samples_per_second": 17.641,
1059
+ "eval_steps_per_second": 2.208,
1060
+ "eval_wer": 2.1335972290945078,
1061
+ "step": 13500
1062
+ },
1063
+ {
1064
+ "epoch": 20.09,
1065
+ "learning_rate": 4.770800627943485e-05,
1066
+ "loss": 2.4532,
1067
+ "step": 13600
1068
+ },
1069
+ {
1070
+ "epoch": 20.24,
1071
+ "learning_rate": 4.7472527472527464e-05,
1072
+ "loss": 2.3913,
1073
+ "step": 13700
1074
+ },
1075
+ {
1076
+ "epoch": 20.38,
1077
+ "learning_rate": 4.7237048665620086e-05,
1078
+ "loss": 2.421,
1079
+ "step": 13800
1080
+ },
1081
+ {
1082
+ "epoch": 20.53,
1083
+ "learning_rate": 4.7001569858712714e-05,
1084
+ "loss": 2.4526,
1085
+ "step": 13900
1086
+ },
1087
+ {
1088
+ "epoch": 20.68,
1089
+ "learning_rate": 4.6766091051805336e-05,
1090
+ "loss": 2.438,
1091
+ "step": 14000
1092
+ },
1093
+ {
1094
+ "epoch": 20.68,
1095
+ "eval_loss": 1.180332064628601,
1096
+ "eval_runtime": 116.5012,
1097
+ "eval_samples_per_second": 17.347,
1098
+ "eval_steps_per_second": 2.172,
1099
+ "eval_wer": 2.1509153884215735,
1100
+ "step": 14000
1101
+ },
1102
+ {
1103
+ "epoch": 20.83,
1104
+ "learning_rate": 4.653061224489796e-05,
1105
+ "loss": 2.4034,
1106
+ "step": 14100
1107
+ },
1108
+ {
1109
+ "epoch": 20.97,
1110
+ "learning_rate": 4.629513343799057e-05,
1111
+ "loss": 2.4306,
1112
+ "step": 14200
1113
+ },
1114
+ {
1115
+ "epoch": 21.12,
1116
+ "learning_rate": 4.6059654631083195e-05,
1117
+ "loss": 2.4145,
1118
+ "step": 14300
1119
+ },
1120
+ {
1121
+ "epoch": 21.27,
1122
+ "learning_rate": 4.582417582417582e-05,
1123
+ "loss": 2.4677,
1124
+ "step": 14400
1125
+ },
1126
+ {
1127
+ "epoch": 21.42,
1128
+ "learning_rate": 4.5588697017268445e-05,
1129
+ "loss": 2.418,
1130
+ "step": 14500
1131
+ },
1132
+ {
1133
+ "epoch": 21.42,
1134
+ "eval_loss": 1.1601430177688599,
1135
+ "eval_runtime": 114.5652,
1136
+ "eval_samples_per_second": 17.641,
1137
+ "eval_steps_per_second": 2.208,
1138
+ "eval_wer": 2.1232063334982683,
1139
+ "step": 14500
1140
+ },
1141
+ {
1142
+ "epoch": 21.57,
1143
+ "learning_rate": 4.535321821036107e-05,
1144
+ "loss": 2.3967,
1145
+ "step": 14600
1146
+ },
1147
+ {
1148
+ "epoch": 21.71,
1149
+ "learning_rate": 4.511773940345368e-05,
1150
+ "loss": 2.3939,
1151
+ "step": 14700
1152
+ },
1153
+ {
1154
+ "epoch": 21.86,
1155
+ "learning_rate": 4.4882260596546304e-05,
1156
+ "loss": 2.3925,
1157
+ "step": 14800
1158
+ },
1159
+ {
1160
+ "epoch": 22.01,
1161
+ "learning_rate": 4.4646781789638925e-05,
1162
+ "loss": 2.3596,
1163
+ "step": 14900
1164
+ },
1165
+ {
1166
+ "epoch": 22.16,
1167
+ "learning_rate": 4.4411302982731554e-05,
1168
+ "loss": 2.3322,
1169
+ "step": 15000
1170
+ },
1171
+ {
1172
+ "epoch": 22.16,
1173
+ "eval_loss": 1.1417704820632935,
1174
+ "eval_runtime": 116.2111,
1175
+ "eval_samples_per_second": 17.391,
1176
+ "eval_steps_per_second": 2.177,
1177
+ "eval_wer": 2.1929737753587335,
1178
+ "step": 15000
1179
+ },
1180
+ {
1181
+ "epoch": 22.3,
1182
+ "learning_rate": 4.4175824175824176e-05,
1183
+ "loss": 2.3821,
1184
+ "step": 15100
1185
+ },
1186
+ {
1187
+ "epoch": 22.45,
1188
+ "learning_rate": 4.394034536891679e-05,
1189
+ "loss": 2.3435,
1190
+ "step": 15200
1191
+ },
1192
+ {
1193
+ "epoch": 22.6,
1194
+ "learning_rate": 4.370486656200941e-05,
1195
+ "loss": 2.3542,
1196
+ "step": 15300
1197
+ },
1198
+ {
1199
+ "epoch": 22.75,
1200
+ "learning_rate": 4.3469387755102034e-05,
1201
+ "loss": 2.3469,
1202
+ "step": 15400
1203
+ },
1204
+ {
1205
+ "epoch": 22.89,
1206
+ "learning_rate": 4.323390894819466e-05,
1207
+ "loss": 2.3387,
1208
+ "step": 15500
1209
+ },
1210
+ {
1211
+ "epoch": 22.89,
1212
+ "eval_loss": 1.1172302961349487,
1213
+ "eval_runtime": 114.3169,
1214
+ "eval_samples_per_second": 17.679,
1215
+ "eval_steps_per_second": 2.213,
1216
+ "eval_wer": 2.2464126669965365,
1217
+ "step": 15500
1218
+ },
1219
+ {
1220
+ "epoch": 23.04,
1221
+ "learning_rate": 4.2998430141287285e-05,
1222
+ "loss": 2.3688,
1223
+ "step": 15600
1224
+ },
1225
+ {
1226
+ "epoch": 23.19,
1227
+ "learning_rate": 4.27629513343799e-05,
1228
+ "loss": 2.3344,
1229
+ "step": 15700
1230
+ },
1231
+ {
1232
+ "epoch": 23.34,
1233
+ "learning_rate": 4.252747252747252e-05,
1234
+ "loss": 2.3245,
1235
+ "step": 15800
1236
+ },
1237
+ {
1238
+ "epoch": 23.49,
1239
+ "learning_rate": 4.229199372056514e-05,
1240
+ "loss": 2.3523,
1241
+ "step": 15900
1242
+ },
1243
+ {
1244
+ "epoch": 23.63,
1245
+ "learning_rate": 4.205651491365777e-05,
1246
+ "loss": 2.3349,
1247
+ "step": 16000
1248
+ },
1249
+ {
1250
+ "epoch": 23.63,
1251
+ "eval_loss": 1.1144375801086426,
1252
+ "eval_runtime": 116.2412,
1253
+ "eval_samples_per_second": 17.386,
1254
+ "eval_steps_per_second": 2.177,
1255
+ "eval_wer": 2.185551707075705,
1256
+ "step": 16000
1257
+ },
1258
+ {
1259
+ "epoch": 23.78,
1260
+ "learning_rate": 4.1821036106750393e-05,
1261
+ "loss": 2.2847,
1262
+ "step": 16100
1263
+ },
1264
+ {
1265
+ "epoch": 23.93,
1266
+ "learning_rate": 4.158555729984301e-05,
1267
+ "loss": 2.3303,
1268
+ "step": 16200
1269
+ },
1270
+ {
1271
+ "epoch": 24.08,
1272
+ "learning_rate": 4.135007849293563e-05,
1273
+ "loss": 2.2994,
1274
+ "step": 16300
1275
+ },
1276
+ {
1277
+ "epoch": 24.22,
1278
+ "learning_rate": 4.111459968602825e-05,
1279
+ "loss": 2.2887,
1280
+ "step": 16400
1281
+ },
1282
+ {
1283
+ "epoch": 24.37,
1284
+ "learning_rate": 4.0879120879120874e-05,
1285
+ "loss": 2.291,
1286
+ "step": 16500
1287
+ },
1288
+ {
1289
+ "epoch": 24.37,
1290
+ "eval_loss": 1.1018128395080566,
1291
+ "eval_runtime": 114.9042,
1292
+ "eval_samples_per_second": 17.589,
1293
+ "eval_steps_per_second": 2.202,
1294
+ "eval_wer": 2.1929737753587335,
1295
+ "step": 16500
1296
+ },
1297
+ {
1298
+ "epoch": 24.52,
1299
+ "learning_rate": 4.06436420722135e-05,
1300
+ "loss": 2.2888,
1301
+ "step": 16600
1302
+ },
1303
+ {
1304
+ "epoch": 24.67,
1305
+ "learning_rate": 4.040816326530612e-05,
1306
+ "loss": 2.2724,
1307
+ "step": 16700
1308
+ },
1309
+ {
1310
+ "epoch": 24.82,
1311
+ "learning_rate": 4.017268445839874e-05,
1312
+ "loss": 2.2922,
1313
+ "step": 16800
1314
+ },
1315
+ {
1316
+ "epoch": 24.96,
1317
+ "learning_rate": 3.993720565149136e-05,
1318
+ "loss": 2.2934,
1319
+ "step": 16900
1320
+ },
1321
+ {
1322
+ "epoch": 25.11,
1323
+ "learning_rate": 3.970172684458398e-05,
1324
+ "loss": 2.2766,
1325
+ "step": 17000
1326
+ },
1327
+ {
1328
+ "epoch": 25.11,
1329
+ "eval_loss": 1.0882744789123535,
1330
+ "eval_runtime": 117.2941,
1331
+ "eval_samples_per_second": 17.23,
1332
+ "eval_steps_per_second": 2.157,
1333
+ "eval_wer": 2.1761504205838693,
1334
+ "step": 17000
1335
+ },
1336
+ {
1337
+ "epoch": 25.26,
1338
+ "learning_rate": 3.946624803767661e-05,
1339
+ "loss": 2.2656,
1340
+ "step": 17100
1341
+ },
1342
+ {
1343
+ "epoch": 25.41,
1344
+ "learning_rate": 3.9230769230769226e-05,
1345
+ "loss": 2.2929,
1346
+ "step": 17200
1347
+ },
1348
+ {
1349
+ "epoch": 25.55,
1350
+ "learning_rate": 3.899529042386185e-05,
1351
+ "loss": 2.2513,
1352
+ "step": 17300
1353
+ },
1354
+ {
1355
+ "epoch": 25.7,
1356
+ "learning_rate": 3.875981161695447e-05,
1357
+ "loss": 2.2603,
1358
+ "step": 17400
1359
+ },
1360
+ {
1361
+ "epoch": 25.85,
1362
+ "learning_rate": 3.852433281004709e-05,
1363
+ "loss": 2.2534,
1364
+ "step": 17500
1365
+ },
1366
+ {
1367
+ "epoch": 25.85,
1368
+ "eval_loss": 1.0743526220321655,
1369
+ "eval_runtime": 118.2043,
1370
+ "eval_samples_per_second": 17.098,
1371
+ "eval_steps_per_second": 2.14,
1372
+ "eval_wer": 2.1875309252845128,
1373
+ "step": 17500
1374
+ },
1375
+ {
1376
+ "epoch": 26.0,
1377
+ "learning_rate": 3.8288854003139713e-05,
1378
+ "loss": 2.2716,
1379
+ "step": 17600
1380
+ },
1381
+ {
1382
+ "epoch": 26.14,
1383
+ "learning_rate": 3.8053375196232335e-05,
1384
+ "loss": 2.2486,
1385
+ "step": 17700
1386
+ },
1387
+ {
1388
+ "epoch": 26.29,
1389
+ "learning_rate": 3.781789638932496e-05,
1390
+ "loss": 2.2068,
1391
+ "step": 17800
1392
+ },
1393
+ {
1394
+ "epoch": 26.44,
1395
+ "learning_rate": 3.758241758241758e-05,
1396
+ "loss": 2.2431,
1397
+ "step": 17900
1398
+ },
1399
+ {
1400
+ "epoch": 26.59,
1401
+ "learning_rate": 3.73469387755102e-05,
1402
+ "loss": 2.2393,
1403
+ "step": 18000
1404
+ },
1405
+ {
1406
+ "epoch": 26.59,
1407
+ "eval_loss": 1.0561192035675049,
1408
+ "eval_runtime": 116.8996,
1409
+ "eval_samples_per_second": 17.288,
1410
+ "eval_steps_per_second": 2.164,
1411
+ "eval_wer": 2.1845620979713014,
1412
+ "step": 18000
1413
+ },
1414
+ {
1415
+ "epoch": 26.74,
1416
+ "learning_rate": 3.711145996860282e-05,
1417
+ "loss": 2.1944,
1418
+ "step": 18100
1419
+ },
1420
+ {
1421
+ "epoch": 26.88,
1422
+ "learning_rate": 3.6875981161695444e-05,
1423
+ "loss": 2.2359,
1424
+ "step": 18200
1425
+ },
1426
+ {
1427
+ "epoch": 27.03,
1428
+ "learning_rate": 3.664285714285714e-05,
1429
+ "loss": 2.2097,
1430
+ "step": 18300
1431
+ },
1432
+ {
1433
+ "epoch": 27.18,
1434
+ "learning_rate": 3.640737833594976e-05,
1435
+ "loss": 2.1431,
1436
+ "step": 18400
1437
+ },
1438
+ {
1439
+ "epoch": 27.33,
1440
+ "learning_rate": 3.617189952904238e-05,
1441
+ "loss": 2.2085,
1442
+ "step": 18500
1443
+ },
1444
+ {
1445
+ "epoch": 27.33,
1446
+ "eval_loss": 1.0465816259384155,
1447
+ "eval_runtime": 115.87,
1448
+ "eval_samples_per_second": 17.442,
1449
+ "eval_steps_per_second": 2.183,
1450
+ "eval_wer": 2.1444829292429493,
1451
+ "step": 18500
1452
+ },
1453
+ {
1454
+ "epoch": 27.47,
1455
+ "learning_rate": 3.5936420722135003e-05,
1456
+ "loss": 2.2204,
1457
+ "step": 18600
1458
+ },
1459
+ {
1460
+ "epoch": 27.62,
1461
+ "learning_rate": 3.5700941915227625e-05,
1462
+ "loss": 2.242,
1463
+ "step": 18700
1464
+ },
1465
+ {
1466
+ "epoch": 27.77,
1467
+ "learning_rate": 3.546546310832025e-05,
1468
+ "loss": 2.1699,
1469
+ "step": 18800
1470
+ },
1471
+ {
1472
+ "epoch": 27.92,
1473
+ "learning_rate": 3.522998430141287e-05,
1474
+ "loss": 2.2152,
1475
+ "step": 18900
1476
+ },
1477
+ {
1478
+ "epoch": 28.06,
1479
+ "learning_rate": 3.499450549450549e-05,
1480
+ "loss": 2.1966,
1481
+ "step": 19000
1482
+ },
1483
+ {
1484
+ "epoch": 28.06,
1485
+ "eval_loss": 1.0382250547409058,
1486
+ "eval_runtime": 116.4655,
1487
+ "eval_samples_per_second": 17.353,
1488
+ "eval_steps_per_second": 2.172,
1489
+ "eval_wer": 2.1088570014844135,
1490
+ "step": 19000
1491
+ },
1492
+ {
1493
+ "epoch": 28.21,
1494
+ "learning_rate": 3.475902668759811e-05,
1495
+ "loss": 2.169,
1496
+ "step": 19100
1497
+ },
1498
+ {
1499
+ "epoch": 28.36,
1500
+ "learning_rate": 3.4523547880690734e-05,
1501
+ "loss": 2.1981,
1502
+ "step": 19200
1503
+ },
1504
+ {
1505
+ "epoch": 28.51,
1506
+ "learning_rate": 3.4288069073783356e-05,
1507
+ "loss": 2.1692,
1508
+ "step": 19300
1509
+ },
1510
+ {
1511
+ "epoch": 28.66,
1512
+ "learning_rate": 3.405259026687598e-05,
1513
+ "loss": 2.1931,
1514
+ "step": 19400
1515
+ },
1516
+ {
1517
+ "epoch": 28.8,
1518
+ "learning_rate": 3.38171114599686e-05,
1519
+ "loss": 2.1794,
1520
+ "step": 19500
1521
+ },
1522
+ {
1523
+ "epoch": 28.8,
1524
+ "eval_loss": 1.0263785123825073,
1525
+ "eval_runtime": 114.5988,
1526
+ "eval_samples_per_second": 17.635,
1527
+ "eval_steps_per_second": 2.208,
1528
+ "eval_wer": 1.9861454725383474,
1529
+ "step": 19500
1530
+ },
1531
+ {
1532
+ "epoch": 28.95,
1533
+ "learning_rate": 3.358163265306122e-05,
1534
+ "loss": 2.1638,
1535
+ "step": 19600
1536
+ },
1537
+ {
1538
+ "epoch": 29.1,
1539
+ "learning_rate": 3.334615384615384e-05,
1540
+ "loss": 2.1714,
1541
+ "step": 19700
1542
+ },
1543
+ {
1544
+ "epoch": 29.25,
1545
+ "learning_rate": 3.3110675039246465e-05,
1546
+ "loss": 2.1514,
1547
+ "step": 19800
1548
+ },
1549
+ {
1550
+ "epoch": 29.39,
1551
+ "learning_rate": 3.2875196232339087e-05,
1552
+ "loss": 2.1374,
1553
+ "step": 19900
1554
+ },
1555
+ {
1556
+ "epoch": 29.54,
1557
+ "learning_rate": 3.263971742543171e-05,
1558
+ "loss": 2.1423,
1559
+ "step": 20000
1560
+ },
1561
+ {
1562
+ "epoch": 29.54,
1563
+ "eval_loss": 1.0245550870895386,
1564
+ "eval_runtime": 116.8375,
1565
+ "eval_samples_per_second": 17.298,
1566
+ "eval_steps_per_second": 2.165,
1567
+ "eval_wer": 1.9678377041068777,
1568
+ "step": 20000
1569
+ },
1570
+ {
1571
+ "epoch": 29.69,
1572
+ "learning_rate": 3.240423861852433e-05,
1573
+ "loss": 2.1807,
1574
+ "step": 20100
1575
+ },
1576
+ {
1577
+ "epoch": 29.84,
1578
+ "learning_rate": 3.216875981161695e-05,
1579
+ "loss": 2.1545,
1580
+ "step": 20200
1581
+ },
1582
+ {
1583
+ "epoch": 29.98,
1584
+ "learning_rate": 3.1933281004709574e-05,
1585
+ "loss": 2.1404,
1586
+ "step": 20300
1587
+ },
1588
+ {
1589
+ "epoch": 30.13,
1590
+ "learning_rate": 3.1697802197802195e-05,
1591
+ "loss": 2.1089,
1592
+ "step": 20400
1593
+ },
1594
+ {
1595
+ "epoch": 30.28,
1596
+ "learning_rate": 3.146232339089482e-05,
1597
+ "loss": 2.1649,
1598
+ "step": 20500
1599
+ },
1600
+ {
1601
+ "epoch": 30.28,
1602
+ "eval_loss": 0.9981661438941956,
1603
+ "eval_runtime": 116.056,
1604
+ "eval_samples_per_second": 17.414,
1605
+ "eval_steps_per_second": 2.18,
1606
+ "eval_wer": 2.000494804552202,
1607
+ "step": 20500
1608
+ },
1609
+ {
1610
+ "epoch": 30.43,
1611
+ "learning_rate": 3.122684458398744e-05,
1612
+ "loss": 2.1425,
1613
+ "step": 20600
1614
+ },
1615
+ {
1616
+ "epoch": 30.58,
1617
+ "learning_rate": 3.099136577708006e-05,
1618
+ "loss": 2.1357,
1619
+ "step": 20700
1620
+ },
1621
+ {
1622
+ "epoch": 30.72,
1623
+ "learning_rate": 3.0758241758241755e-05,
1624
+ "loss": 2.1251,
1625
+ "step": 20800
1626
+ },
1627
+ {
1628
+ "epoch": 30.87,
1629
+ "learning_rate": 3.052276295133438e-05,
1630
+ "loss": 2.1256,
1631
+ "step": 20900
1632
+ },
1633
+ {
1634
+ "epoch": 31.02,
1635
+ "learning_rate": 3.0287284144427e-05,
1636
+ "loss": 2.143,
1637
+ "step": 21000
1638
+ },
1639
+ {
1640
+ "epoch": 31.02,
1641
+ "eval_loss": 0.9985482692718506,
1642
+ "eval_runtime": 116.0424,
1643
+ "eval_samples_per_second": 17.416,
1644
+ "eval_steps_per_second": 2.18,
1645
+ "eval_wer": 2.045027214250371,
1646
+ "step": 21000
1647
+ },
1648
+ {
1649
+ "epoch": 31.17,
1650
+ "learning_rate": 3.005180533751962e-05,
1651
+ "loss": 2.0744,
1652
+ "step": 21100
1653
+ },
1654
+ {
1655
+ "epoch": 31.31,
1656
+ "learning_rate": 2.9816326530612242e-05,
1657
+ "loss": 2.0831,
1658
+ "step": 21200
1659
+ },
1660
+ {
1661
+ "epoch": 31.46,
1662
+ "learning_rate": 2.9583202511773936e-05,
1663
+ "loss": 2.1254,
1664
+ "step": 21300
1665
+ },
1666
+ {
1667
+ "epoch": 31.61,
1668
+ "learning_rate": 2.934772370486656e-05,
1669
+ "loss": 2.1357,
1670
+ "step": 21400
1671
+ },
1672
+ {
1673
+ "epoch": 31.76,
1674
+ "learning_rate": 2.911224489795918e-05,
1675
+ "loss": 2.1338,
1676
+ "step": 21500
1677
+ },
1678
+ {
1679
+ "epoch": 31.76,
1680
+ "eval_loss": 0.9932034611701965,
1681
+ "eval_runtime": 114.6961,
1682
+ "eval_samples_per_second": 17.62,
1683
+ "eval_steps_per_second": 2.206,
1684
+ "eval_wer": 2.0024740227610094,
1685
+ "step": 21500
1686
+ },
1687
+ {
1688
+ "epoch": 31.91,
1689
+ "learning_rate": 2.8876766091051805e-05,
1690
+ "loss": 2.1053,
1691
+ "step": 21600
1692
+ },
1693
+ {
1694
+ "epoch": 32.05,
1695
+ "learning_rate": 2.8641287284144426e-05,
1696
+ "loss": 2.1111,
1697
+ "step": 21700
1698
+ },
1699
+ {
1700
+ "epoch": 32.2,
1701
+ "learning_rate": 2.8405808477237045e-05,
1702
+ "loss": 2.1028,
1703
+ "step": 21800
1704
+ },
1705
+ {
1706
+ "epoch": 32.35,
1707
+ "learning_rate": 2.817032967032967e-05,
1708
+ "loss": 2.0879,
1709
+ "step": 21900
1710
+ },
1711
+ {
1712
+ "epoch": 32.5,
1713
+ "learning_rate": 2.793485086342229e-05,
1714
+ "loss": 2.1076,
1715
+ "step": 22000
1716
+ },
1717
+ {
1718
+ "epoch": 32.5,
1719
+ "eval_loss": 0.9902665019035339,
1720
+ "eval_runtime": 120.6987,
1721
+ "eval_samples_per_second": 16.744,
1722
+ "eval_steps_per_second": 2.096,
1723
+ "eval_wer": 2.0504700643245917,
1724
+ "step": 22000
1725
+ },
1726
+ {
1727
+ "epoch": 32.64,
1728
+ "learning_rate": 2.769937205651491e-05,
1729
+ "loss": 2.1107,
1730
+ "step": 22100
1731
+ },
1732
+ {
1733
+ "epoch": 32.79,
1734
+ "learning_rate": 2.7463893249607535e-05,
1735
+ "loss": 2.0953,
1736
+ "step": 22200
1737
+ },
1738
+ {
1739
+ "epoch": 32.94,
1740
+ "learning_rate": 2.7228414442700154e-05,
1741
+ "loss": 2.0619,
1742
+ "step": 22300
1743
+ },
1744
+ {
1745
+ "epoch": 33.09,
1746
+ "learning_rate": 2.6992935635792776e-05,
1747
+ "loss": 2.0531,
1748
+ "step": 22400
1749
+ },
1750
+ {
1751
+ "epoch": 33.23,
1752
+ "learning_rate": 2.6757456828885397e-05,
1753
+ "loss": 2.0519,
1754
+ "step": 22500
1755
+ },
1756
+ {
1757
+ "epoch": 33.23,
1758
+ "eval_loss": 0.9833839535713196,
1759
+ "eval_runtime": 116.5317,
1760
+ "eval_samples_per_second": 17.343,
1761
+ "eval_steps_per_second": 2.171,
1762
+ "eval_wer": 2.07372587827808,
1763
+ "step": 22500
1764
+ },
1765
+ {
1766
+ "epoch": 33.38,
1767
+ "learning_rate": 2.652197802197802e-05,
1768
+ "loss": 2.0493,
1769
+ "step": 22600
1770
+ },
1771
+ {
1772
+ "epoch": 33.53,
1773
+ "learning_rate": 2.6286499215070644e-05,
1774
+ "loss": 2.0749,
1775
+ "step": 22700
1776
+ },
1777
+ {
1778
+ "epoch": 33.68,
1779
+ "learning_rate": 2.6051020408163263e-05,
1780
+ "loss": 2.0838,
1781
+ "step": 22800
1782
+ },
1783
+ {
1784
+ "epoch": 33.83,
1785
+ "learning_rate": 2.5815541601255884e-05,
1786
+ "loss": 2.0629,
1787
+ "step": 22900
1788
+ },
1789
+ {
1790
+ "epoch": 33.97,
1791
+ "learning_rate": 2.5580062794348506e-05,
1792
+ "loss": 2.0534,
1793
+ "step": 23000
1794
+ },
1795
+ {
1796
+ "epoch": 33.97,
1797
+ "eval_loss": 0.9755652546882629,
1798
+ "eval_runtime": 114.923,
1799
+ "eval_samples_per_second": 17.586,
1800
+ "eval_steps_per_second": 2.201,
1801
+ "eval_wer": 2.024740227610094,
1802
+ "step": 23000
1803
+ },
1804
+ {
1805
+ "epoch": 34.12,
1806
+ "learning_rate": 2.5344583987441128e-05,
1807
+ "loss": 2.067,
1808
+ "step": 23100
1809
+ },
1810
+ {
1811
+ "epoch": 34.27,
1812
+ "learning_rate": 2.5109105180533746e-05,
1813
+ "loss": 2.0252,
1814
+ "step": 23200
1815
+ },
1816
+ {
1817
+ "epoch": 34.42,
1818
+ "learning_rate": 2.487362637362637e-05,
1819
+ "loss": 2.0483,
1820
+ "step": 23300
1821
+ },
1822
+ {
1823
+ "epoch": 34.56,
1824
+ "learning_rate": 2.4638147566718993e-05,
1825
+ "loss": 2.0464,
1826
+ "step": 23400
1827
+ },
1828
+ {
1829
+ "epoch": 34.71,
1830
+ "learning_rate": 2.4402668759811615e-05,
1831
+ "loss": 2.0121,
1832
+ "step": 23500
1833
+ },
1834
+ {
1835
+ "epoch": 34.71,
1836
+ "eval_loss": 0.968792736530304,
1837
+ "eval_runtime": 114.3088,
1838
+ "eval_samples_per_second": 17.68,
1839
+ "eval_steps_per_second": 2.213,
1840
+ "eval_wer": 2.1439881246907473,
1841
+ "step": 23500
1842
+ },
1843
+ {
1844
+ "epoch": 34.86,
1845
+ "learning_rate": 2.4167189952904237e-05,
1846
+ "loss": 2.036,
1847
+ "step": 23600
1848
+ },
1849
+ {
1850
+ "epoch": 35.01,
1851
+ "learning_rate": 2.3931711145996855e-05,
1852
+ "loss": 2.013,
1853
+ "step": 23700
1854
+ },
1855
+ {
1856
+ "epoch": 35.16,
1857
+ "learning_rate": 2.369623233908948e-05,
1858
+ "loss": 2.0043,
1859
+ "step": 23800
1860
+ },
1861
+ {
1862
+ "epoch": 35.3,
1863
+ "learning_rate": 2.3460753532182102e-05,
1864
+ "loss": 2.037,
1865
+ "step": 23900
1866
+ },
1867
+ {
1868
+ "epoch": 35.45,
1869
+ "learning_rate": 2.322527472527472e-05,
1870
+ "loss": 2.0161,
1871
+ "step": 24000
1872
+ },
1873
+ {
1874
+ "epoch": 35.45,
1875
+ "eval_loss": 0.9581586718559265,
1876
+ "eval_runtime": 115.925,
1877
+ "eval_samples_per_second": 17.434,
1878
+ "eval_steps_per_second": 2.182,
1879
+ "eval_wer": 2.1232063334982683,
1880
+ "step": 24000
1881
+ },
1882
+ {
1883
+ "epoch": 35.6,
1884
+ "learning_rate": 2.2989795918367346e-05,
1885
+ "loss": 2.0256,
1886
+ "step": 24100
1887
+ },
1888
+ {
1889
+ "epoch": 35.75,
1890
+ "learning_rate": 2.2754317111459968e-05,
1891
+ "loss": 2.0265,
1892
+ "step": 24200
1893
+ },
1894
+ {
1895
+ "epoch": 35.89,
1896
+ "learning_rate": 2.251883830455259e-05,
1897
+ "loss": 2.0298,
1898
+ "step": 24300
1899
+ },
1900
+ {
1901
+ "epoch": 36.04,
1902
+ "learning_rate": 2.228335949764521e-05,
1903
+ "loss": 2.0028,
1904
+ "step": 24400
1905
+ },
1906
+ {
1907
+ "epoch": 36.19,
1908
+ "learning_rate": 2.204788069073783e-05,
1909
+ "loss": 2.0178,
1910
+ "step": 24500
1911
+ },
1912
+ {
1913
+ "epoch": 36.19,
1914
+ "eval_loss": 0.9480372071266174,
1915
+ "eval_runtime": 116.8212,
1916
+ "eval_samples_per_second": 17.3,
1917
+ "eval_steps_per_second": 2.166,
1918
+ "eval_wer": 2.0895596239485403,
1919
+ "step": 24500
1920
+ },
1921
+ {
1922
+ "epoch": 36.34,
1923
+ "learning_rate": 2.1812401883830455e-05,
1924
+ "loss": 2.008,
1925
+ "step": 24600
1926
+ },
1927
+ {
1928
+ "epoch": 36.48,
1929
+ "learning_rate": 2.1576923076923076e-05,
1930
+ "loss": 2.0132,
1931
+ "step": 24700
1932
+ },
1933
+ {
1934
+ "epoch": 36.63,
1935
+ "learning_rate": 2.1341444270015695e-05,
1936
+ "loss": 2.0204,
1937
+ "step": 24800
1938
+ },
1939
+ {
1940
+ "epoch": 36.78,
1941
+ "learning_rate": 2.110596546310832e-05,
1942
+ "loss": 1.9806,
1943
+ "step": 24900
1944
+ },
1945
+ {
1946
+ "epoch": 36.93,
1947
+ "learning_rate": 2.087048665620094e-05,
1948
+ "loss": 2.0154,
1949
+ "step": 25000
1950
+ },
1951
+ {
1952
+ "epoch": 36.93,
1953
+ "eval_loss": 0.9483017325401306,
1954
+ "eval_runtime": 117.4294,
1955
+ "eval_samples_per_second": 17.21,
1956
+ "eval_steps_per_second": 2.154,
1957
+ "eval_wer": 2.078673923800099,
1958
+ "step": 25000
1959
+ },
1960
+ {
1961
+ "epoch": 37.08,
1962
+ "learning_rate": 2.063500784929356e-05,
1963
+ "loss": 1.997,
1964
+ "step": 25100
1965
+ },
1966
+ {
1967
+ "epoch": 37.22,
1968
+ "learning_rate": 2.0399529042386185e-05,
1969
+ "loss": 1.9712,
1970
+ "step": 25200
1971
+ },
1972
+ {
1973
+ "epoch": 37.37,
1974
+ "learning_rate": 2.0164050235478804e-05,
1975
+ "loss": 2.0131,
1976
+ "step": 25300
1977
+ },
1978
+ {
1979
+ "epoch": 37.52,
1980
+ "learning_rate": 1.992857142857143e-05,
1981
+ "loss": 1.9605,
1982
+ "step": 25400
1983
+ },
1984
+ {
1985
+ "epoch": 37.67,
1986
+ "learning_rate": 1.9695447409733123e-05,
1987
+ "loss": 1.9966,
1988
+ "step": 25500
1989
+ },
1990
+ {
1991
+ "epoch": 37.67,
1992
+ "eval_loss": 0.940608024597168,
1993
+ "eval_runtime": 115.2635,
1994
+ "eval_samples_per_second": 17.534,
1995
+ "eval_steps_per_second": 2.195,
1996
+ "eval_wer": 2.0296882731321126,
1997
+ "step": 25500
1998
+ },
1999
+ {
2000
+ "epoch": 37.81,
2001
+ "learning_rate": 1.945996860282574e-05,
2002
+ "loss": 1.9879,
2003
+ "step": 25600
2004
+ },
2005
+ {
2006
+ "epoch": 37.96,
2007
+ "learning_rate": 1.9224489795918367e-05,
2008
+ "loss": 1.9836,
2009
+ "step": 25700
2010
+ },
2011
+ {
2012
+ "epoch": 38.11,
2013
+ "learning_rate": 1.8989010989010988e-05,
2014
+ "loss": 1.9872,
2015
+ "step": 25800
2016
+ },
2017
+ {
2018
+ "epoch": 38.26,
2019
+ "learning_rate": 1.8753532182103607e-05,
2020
+ "loss": 1.9684,
2021
+ "step": 25900
2022
+ },
2023
+ {
2024
+ "epoch": 38.4,
2025
+ "learning_rate": 1.851805337519623e-05,
2026
+ "loss": 1.9753,
2027
+ "step": 26000
2028
+ },
2029
+ {
2030
+ "epoch": 38.4,
2031
+ "eval_loss": 0.9418594837188721,
2032
+ "eval_runtime": 115.7124,
2033
+ "eval_samples_per_second": 17.466,
2034
+ "eval_steps_per_second": 2.186,
2035
+ "eval_wer": 2.0346363186541314,
2036
+ "step": 26000
2037
+ },
2038
+ {
2039
+ "epoch": 38.55,
2040
+ "learning_rate": 1.828257456828885e-05,
2041
+ "loss": 1.9926,
2042
+ "step": 26100
2043
+ },
2044
+ {
2045
+ "epoch": 38.7,
2046
+ "learning_rate": 1.8047095761381475e-05,
2047
+ "loss": 1.9685,
2048
+ "step": 26200
2049
+ },
2050
+ {
2051
+ "epoch": 38.85,
2052
+ "learning_rate": 1.7811616954474097e-05,
2053
+ "loss": 1.9707,
2054
+ "step": 26300
2055
+ },
2056
+ {
2057
+ "epoch": 39.0,
2058
+ "learning_rate": 1.7576138147566716e-05,
2059
+ "loss": 1.9477,
2060
+ "step": 26400
2061
+ },
2062
+ {
2063
+ "epoch": 39.14,
2064
+ "learning_rate": 1.7340659340659337e-05,
2065
+ "loss": 1.9524,
2066
+ "step": 26500
2067
+ },
2068
+ {
2069
+ "epoch": 39.14,
2070
+ "eval_loss": 0.927354097366333,
2071
+ "eval_runtime": 115.8614,
2072
+ "eval_samples_per_second": 17.443,
2073
+ "eval_steps_per_second": 2.184,
2074
+ "eval_wer": 2.0697674418604652,
2075
+ "step": 26500
2076
+ },
2077
+ {
2078
+ "epoch": 39.29,
2079
+ "learning_rate": 1.7105180533751963e-05,
2080
+ "loss": 1.9673,
2081
+ "step": 26600
2082
+ },
2083
+ {
2084
+ "epoch": 39.44,
2085
+ "learning_rate": 1.6869701726844584e-05,
2086
+ "loss": 1.9802,
2087
+ "step": 26700
2088
+ },
2089
+ {
2090
+ "epoch": 39.59,
2091
+ "learning_rate": 1.6634222919937203e-05,
2092
+ "loss": 1.9408,
2093
+ "step": 26800
2094
+ },
2095
+ {
2096
+ "epoch": 39.73,
2097
+ "learning_rate": 1.6398744113029824e-05,
2098
+ "loss": 1.9482,
2099
+ "step": 26900
2100
+ },
2101
+ {
2102
+ "epoch": 39.88,
2103
+ "learning_rate": 1.6163265306122446e-05,
2104
+ "loss": 1.9427,
2105
+ "step": 27000
2106
+ },
2107
+ {
2108
+ "epoch": 39.88,
2109
+ "eval_loss": 0.9232719540596008,
2110
+ "eval_runtime": 116.3191,
2111
+ "eval_samples_per_second": 17.375,
2112
+ "eval_steps_per_second": 2.175,
2113
+ "eval_wer": 2.078673923800099,
2114
+ "step": 27000
2115
+ },
2116
+ {
2117
+ "epoch": 40.03,
2118
+ "learning_rate": 1.592778649921507e-05,
2119
+ "loss": 1.9653,
2120
+ "step": 27100
2121
+ },
2122
+ {
2123
+ "epoch": 40.18,
2124
+ "learning_rate": 1.569230769230769e-05,
2125
+ "loss": 1.9157,
2126
+ "step": 27200
2127
+ },
2128
+ {
2129
+ "epoch": 40.32,
2130
+ "learning_rate": 1.545682888540031e-05,
2131
+ "loss": 1.9493,
2132
+ "step": 27300
2133
+ },
2134
+ {
2135
+ "epoch": 40.47,
2136
+ "learning_rate": 1.5221350078492935e-05,
2137
+ "loss": 1.8974,
2138
+ "step": 27400
2139
+ },
2140
+ {
2141
+ "epoch": 40.62,
2142
+ "learning_rate": 1.4985871271585557e-05,
2143
+ "loss": 1.9258,
2144
+ "step": 27500
2145
+ },
2146
+ {
2147
+ "epoch": 40.62,
2148
+ "eval_loss": 0.9182448983192444,
2149
+ "eval_runtime": 115.4065,
2150
+ "eval_samples_per_second": 17.512,
2151
+ "eval_steps_per_second": 2.192,
2152
+ "eval_wer": 2.052944087085601,
2153
+ "step": 27500
2154
+ },
2155
+ {
2156
+ "epoch": 40.77,
2157
+ "learning_rate": 1.4750392464678177e-05,
2158
+ "loss": 1.9354,
2159
+ "step": 27600
2160
+ },
2161
+ {
2162
+ "epoch": 40.92,
2163
+ "learning_rate": 1.4514913657770799e-05,
2164
+ "loss": 1.952,
2165
+ "step": 27700
2166
+ },
2167
+ {
2168
+ "epoch": 41.06,
2169
+ "learning_rate": 1.4281789638932496e-05,
2170
+ "loss": 1.9231,
2171
+ "step": 27800
2172
+ },
2173
+ {
2174
+ "epoch": 41.21,
2175
+ "learning_rate": 1.4046310832025116e-05,
2176
+ "loss": 1.9465,
2177
+ "step": 27900
2178
+ },
2179
+ {
2180
+ "epoch": 41.36,
2181
+ "learning_rate": 1.3810832025117738e-05,
2182
+ "loss": 1.9031,
2183
+ "step": 28000
2184
+ },
2185
+ {
2186
+ "epoch": 41.36,
2187
+ "eval_loss": 0.9149593114852905,
2188
+ "eval_runtime": 116.2555,
2189
+ "eval_samples_per_second": 17.384,
2190
+ "eval_steps_per_second": 2.176,
2191
+ "eval_wer": 2.078673923800099,
2192
+ "step": 28000
2193
+ },
2194
+ {
2195
+ "epoch": 41.51,
2196
+ "learning_rate": 1.357535321821036e-05,
2197
+ "loss": 1.9361,
2198
+ "step": 28100
2199
+ },
2200
+ {
2201
+ "epoch": 41.65,
2202
+ "learning_rate": 1.3342229199372054e-05,
2203
+ "loss": 1.916,
2204
+ "step": 28200
2205
+ },
2206
+ {
2207
+ "epoch": 41.8,
2208
+ "learning_rate": 1.3106750392464677e-05,
2209
+ "loss": 1.9149,
2210
+ "step": 28300
2211
+ },
2212
+ {
2213
+ "epoch": 41.95,
2214
+ "learning_rate": 1.2871271585557299e-05,
2215
+ "loss": 1.9037,
2216
+ "step": 28400
2217
+ },
2218
+ {
2219
+ "epoch": 42.1,
2220
+ "learning_rate": 1.263579277864992e-05,
2221
+ "loss": 1.9297,
2222
+ "step": 28500
2223
+ },
2224
+ {
2225
+ "epoch": 42.1,
2226
+ "eval_loss": 0.9040070176124573,
2227
+ "eval_runtime": 113.8901,
2228
+ "eval_samples_per_second": 17.745,
2229
+ "eval_steps_per_second": 2.221,
2230
+ "eval_wer": 2.0504700643245917,
2231
+ "step": 28500
2232
+ },
2233
+ {
2234
+ "epoch": 42.25,
2235
+ "learning_rate": 1.2400313971742541e-05,
2236
+ "loss": 1.8855,
2237
+ "step": 28600
2238
+ },
2239
+ {
2240
+ "epoch": 42.39,
2241
+ "learning_rate": 1.2164835164835163e-05,
2242
+ "loss": 1.9095,
2243
+ "step": 28700
2244
+ },
2245
+ {
2246
+ "epoch": 42.54,
2247
+ "learning_rate": 1.1929356357927786e-05,
2248
+ "loss": 1.8913,
2249
+ "step": 28800
2250
+ },
2251
+ {
2252
+ "epoch": 42.69,
2253
+ "learning_rate": 1.1693877551020408e-05,
2254
+ "loss": 1.8685,
2255
+ "step": 28900
2256
+ },
2257
+ {
2258
+ "epoch": 42.84,
2259
+ "learning_rate": 1.1458398744113028e-05,
2260
+ "loss": 1.9041,
2261
+ "step": 29000
2262
+ },
2263
+ {
2264
+ "epoch": 42.84,
2265
+ "eval_loss": 0.9008907675743103,
2266
+ "eval_runtime": 114.9643,
2267
+ "eval_samples_per_second": 17.579,
2268
+ "eval_steps_per_second": 2.201,
2269
+ "eval_wer": 2.05789213260762,
2270
+ "step": 29000
2271
+ },
2272
+ {
2273
+ "epoch": 42.98,
2274
+ "learning_rate": 1.122291993720565e-05,
2275
+ "loss": 1.8963,
2276
+ "step": 29100
2277
+ },
2278
+ {
2279
+ "epoch": 43.13,
2280
+ "learning_rate": 1.0987441130298273e-05,
2281
+ "loss": 1.9068,
2282
+ "step": 29200
2283
+ },
2284
+ {
2285
+ "epoch": 43.28,
2286
+ "learning_rate": 1.0751962323390895e-05,
2287
+ "loss": 1.9003,
2288
+ "step": 29300
2289
+ },
2290
+ {
2291
+ "epoch": 43.43,
2292
+ "learning_rate": 1.0516483516483515e-05,
2293
+ "loss": 1.891,
2294
+ "step": 29400
2295
+ },
2296
+ {
2297
+ "epoch": 43.57,
2298
+ "learning_rate": 1.0281004709576137e-05,
2299
+ "loss": 1.8929,
2300
+ "step": 29500
2301
+ },
2302
+ {
2303
+ "epoch": 43.57,
2304
+ "eval_loss": 0.8968304991722107,
2305
+ "eval_runtime": 116.4378,
2306
+ "eval_samples_per_second": 17.357,
2307
+ "eval_steps_per_second": 2.173,
2308
+ "eval_wer": 2.032657100445324,
2309
+ "step": 29500
2310
+ },
2311
+ {
2312
+ "epoch": 43.72,
2313
+ "learning_rate": 1.0045525902668759e-05,
2314
+ "loss": 1.8827,
2315
+ "step": 29600
2316
+ },
2317
+ {
2318
+ "epoch": 43.87,
2319
+ "learning_rate": 9.810047095761382e-06,
2320
+ "loss": 1.8862,
2321
+ "step": 29700
2322
+ },
2323
+ {
2324
+ "epoch": 44.02,
2325
+ "learning_rate": 9.574568288854002e-06,
2326
+ "loss": 1.8787,
2327
+ "step": 29800
2328
+ },
2329
+ {
2330
+ "epoch": 44.17,
2331
+ "learning_rate": 9.339089481946624e-06,
2332
+ "loss": 1.8501,
2333
+ "step": 29900
2334
+ },
2335
+ {
2336
+ "epoch": 44.31,
2337
+ "learning_rate": 9.103610675039246e-06,
2338
+ "loss": 1.9077,
2339
+ "step": 30000
2340
+ },
2341
+ {
2342
+ "epoch": 44.31,
2343
+ "eval_loss": 0.8953686952590942,
2344
+ "eval_runtime": 115.4838,
2345
+ "eval_samples_per_second": 17.5,
2346
+ "eval_steps_per_second": 2.191,
2347
+ "eval_wer": 2.061850569025235,
2348
+ "step": 30000
2349
+ },
2350
+ {
2351
+ "epoch": 44.46,
2352
+ "learning_rate": 8.868131868131868e-06,
2353
+ "loss": 1.8804,
2354
+ "step": 30100
2355
+ },
2356
+ {
2357
+ "epoch": 44.61,
2358
+ "learning_rate": 8.63265306122449e-06,
2359
+ "loss": 1.8723,
2360
+ "step": 30200
2361
+ },
2362
+ {
2363
+ "epoch": 44.76,
2364
+ "learning_rate": 8.397174254317111e-06,
2365
+ "loss": 1.8577,
2366
+ "step": 30300
2367
+ },
2368
+ {
2369
+ "epoch": 44.9,
2370
+ "learning_rate": 8.161695447409733e-06,
2371
+ "loss": 1.8811,
2372
+ "step": 30400
2373
+ },
2374
+ {
2375
+ "epoch": 45.05,
2376
+ "learning_rate": 7.928571428571429e-06,
2377
+ "loss": 1.8504,
2378
+ "step": 30500
2379
+ },
2380
+ {
2381
+ "epoch": 45.05,
2382
+ "eval_loss": 0.892192542552948,
2383
+ "eval_runtime": 116.2513,
2384
+ "eval_samples_per_second": 17.385,
2385
+ "eval_steps_per_second": 2.176,
2386
+ "eval_wer": 2.07372587827808,
2387
+ "step": 30500
2388
+ },
2389
+ {
2390
+ "epoch": 45.2,
2391
+ "learning_rate": 7.693092621664049e-06,
2392
+ "loss": 1.861,
2393
+ "step": 30600
2394
+ },
2395
+ {
2396
+ "epoch": 45.35,
2397
+ "learning_rate": 7.457613814756671e-06,
2398
+ "loss": 1.8496,
2399
+ "step": 30700
2400
+ },
2401
+ {
2402
+ "epoch": 45.49,
2403
+ "learning_rate": 7.222135007849293e-06,
2404
+ "loss": 1.8612,
2405
+ "step": 30800
2406
+ },
2407
+ {
2408
+ "epoch": 45.64,
2409
+ "learning_rate": 6.986656200941915e-06,
2410
+ "loss": 1.865,
2411
+ "step": 30900
2412
+ },
2413
+ {
2414
+ "epoch": 45.79,
2415
+ "learning_rate": 6.751177394034536e-06,
2416
+ "loss": 1.8732,
2417
+ "step": 31000
2418
+ },
2419
+ {
2420
+ "epoch": 45.79,
2421
+ "eval_loss": 0.8897548317909241,
2422
+ "eval_runtime": 116.5927,
2423
+ "eval_samples_per_second": 17.334,
2424
+ "eval_steps_per_second": 2.17,
2425
+ "eval_wer": 2.0682830282038593,
2426
+ "step": 31000
2427
+ },
2428
+ {
2429
+ "epoch": 45.94,
2430
+ "learning_rate": 6.5156985871271585e-06,
2431
+ "loss": 1.8374,
2432
+ "step": 31100
2433
+ },
2434
+ {
2435
+ "epoch": 46.09,
2436
+ "learning_rate": 6.280219780219779e-06,
2437
+ "loss": 1.8395,
2438
+ "step": 31200
2439
+ },
2440
+ {
2441
+ "epoch": 46.23,
2442
+ "learning_rate": 6.044740973312402e-06,
2443
+ "loss": 1.8377,
2444
+ "step": 31300
2445
+ },
2446
+ {
2447
+ "epoch": 46.38,
2448
+ "learning_rate": 5.809262166405023e-06,
2449
+ "loss": 1.87,
2450
+ "step": 31400
2451
+ },
2452
+ {
2453
+ "epoch": 46.53,
2454
+ "learning_rate": 5.573783359497644e-06,
2455
+ "loss": 1.877,
2456
+ "step": 31500
2457
+ },
2458
+ {
2459
+ "epoch": 46.53,
2460
+ "eval_loss": 0.8848925828933716,
2461
+ "eval_runtime": 116.1465,
2462
+ "eval_samples_per_second": 17.4,
2463
+ "eval_steps_per_second": 2.178,
2464
+ "eval_wer": 2.0588817417120238,
2465
+ "step": 31500
2466
+ },
2467
+ {
2468
+ "epoch": 46.68,
2469
+ "learning_rate": 5.3383045525902665e-06,
2470
+ "loss": 1.8256,
2471
+ "step": 31600
2472
+ },
2473
+ {
2474
+ "epoch": 46.82,
2475
+ "learning_rate": 5.1028257456828875e-06,
2476
+ "loss": 1.8317,
2477
+ "step": 31700
2478
+ },
2479
+ {
2480
+ "epoch": 46.97,
2481
+ "learning_rate": 4.86734693877551e-06,
2482
+ "loss": 1.8579,
2483
+ "step": 31800
2484
+ },
2485
+ {
2486
+ "epoch": 47.12,
2487
+ "learning_rate": 4.631868131868132e-06,
2488
+ "loss": 1.839,
2489
+ "step": 31900
2490
+ },
2491
+ {
2492
+ "epoch": 47.27,
2493
+ "learning_rate": 4.396389324960754e-06,
2494
+ "loss": 1.8587,
2495
+ "step": 32000
2496
+ },
2497
+ {
2498
+ "epoch": 47.27,
2499
+ "eval_loss": 0.8843359351158142,
2500
+ "eval_runtime": 116.5866,
2501
+ "eval_samples_per_second": 17.335,
2502
+ "eval_steps_per_second": 2.17,
2503
+ "eval_wer": 2.045027214250371,
2504
+ "step": 32000
2505
+ },
2506
+ {
2507
+ "epoch": 47.41,
2508
+ "learning_rate": 4.160910518053375e-06,
2509
+ "loss": 1.8419,
2510
+ "step": 32100
2511
+ },
2512
+ {
2513
+ "epoch": 47.56,
2514
+ "learning_rate": 3.925431711145996e-06,
2515
+ "loss": 1.8639,
2516
+ "step": 32200
2517
+ },
2518
+ {
2519
+ "epoch": 47.71,
2520
+ "learning_rate": 3.6899529042386186e-06,
2521
+ "loss": 1.8395,
2522
+ "step": 32300
2523
+ },
2524
+ {
2525
+ "epoch": 47.86,
2526
+ "learning_rate": 3.45447409733124e-06,
2527
+ "loss": 1.8369,
2528
+ "step": 32400
2529
+ },
2530
+ {
2531
+ "epoch": 48.01,
2532
+ "learning_rate": 3.2189952904238617e-06,
2533
+ "loss": 1.8236,
2534
+ "step": 32500
2535
+ },
2536
+ {
2537
+ "epoch": 48.01,
2538
+ "eval_loss": 0.8810222148895264,
2539
+ "eval_runtime": 115.817,
2540
+ "eval_samples_per_second": 17.45,
2541
+ "eval_steps_per_second": 2.184,
2542
+ "eval_wer": 2.0554181098466104,
2543
+ "step": 32500
2544
+ },
2545
+ {
2546
+ "epoch": 48.15,
2547
+ "learning_rate": 2.9835164835164835e-06,
2548
+ "loss": 1.8468,
2549
+ "step": 32600
2550
+ },
2551
+ {
2552
+ "epoch": 48.3,
2553
+ "learning_rate": 2.7503924646781788e-06,
2554
+ "loss": 1.8326,
2555
+ "step": 32700
2556
+ },
2557
+ {
2558
+ "epoch": 48.45,
2559
+ "learning_rate": 2.5149136577708006e-06,
2560
+ "loss": 1.8279,
2561
+ "step": 32800
2562
+ },
2563
+ {
2564
+ "epoch": 48.6,
2565
+ "learning_rate": 2.2794348508634223e-06,
2566
+ "loss": 1.8324,
2567
+ "step": 32900
2568
+ },
2569
+ {
2570
+ "epoch": 48.74,
2571
+ "learning_rate": 2.043956043956044e-06,
2572
+ "loss": 1.8392,
2573
+ "step": 33000
2574
+ },
2575
+ {
2576
+ "epoch": 48.74,
2577
+ "eval_loss": 0.8820456266403198,
2578
+ "eval_runtime": 115.6891,
2579
+ "eval_samples_per_second": 17.469,
2580
+ "eval_steps_per_second": 2.187,
2581
+ "eval_wer": 2.0573973280554183,
2582
+ "step": 33000
2583
+ }
2584
+ ],
2585
+ "max_steps": 33850,
2586
+ "num_train_epochs": 50,
2587
+ "total_flos": 1.5017069176443896e+20,
2588
+ "trial_name": null,
2589
+ "trial_params": null
2590
+ }
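The `log_history` entries above record a training log every 100 steps (`loss`, `learning_rate`) and an evaluation log every 500 steps (`eval_loss`, `eval_wer`). A minimal sketch for pulling the eval curve out of this `trainer_state.json`, assuming only the Python standard library and a local copy of the file added above:

```python
import json

# Minimal sketch: read the checkpoint-33000 trainer state added above.
with open("checkpoint-33000/trainer_state.json") as f:
    state = json.load(f)

# Entries containing "eval_loss" are evaluation logs; the rest are training logs.
eval_logs = [entry for entry in state["log_history"] if "eval_loss" in entry]
for entry in eval_logs:
    print(entry["step"], entry["eval_loss"], entry["eval_wer"])
```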
checkpoint-33000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:865e00e6503ba9a84aae151f6fca7048aa8118080935253305e52fcf3ebbf980
+ size 3055
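The `*.bin` and `*.pt` entries in this commit are Git LFS pointer files (three lines: `version`, `oid`, `size`); the binaries themselves live in LFS storage. A minimal sketch for reading such a pointer, assuming the local file is still the un-smudged pointer text rather than the downloaded binary; `read_lfs_pointer` is a hypothetical helper, not part of any library:

```python
# Minimal sketch: parse the three-line Git LFS pointer format shown above.
# Only valid while the file is still a pointer (i.e. `git lfs pull` has not
# replaced it with the real binary). read_lfs_pointer is a hypothetical helper.
def read_lfs_pointer(path: str) -> dict:
    fields = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            key, _, value = line.strip().partition(" ")
            fields[key] = value
    return fields

ptr = read_lfs_pointer("checkpoint-33000/training_args.bin")
print(ptr["oid"], ptr["size"])  # sha256:865e00e6..., 3055
```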
checkpoint-33500/config.json ADDED
@@ -0,0 +1,107 @@
+ {
+ "_name_or_path": "facebook/wav2vec2-xls-r-300m",
+ "activation_dropout": 0.1,
+ "adapter_kernel_size": 3,
+ "adapter_stride": 2,
+ "add_adapter": false,
+ "apply_spec_augment": true,
+ "architectures": [
+ "Wav2Vec2ForCTC"
+ ],
+ "attention_dropout": 0.0,
+ "bos_token_id": 1,
+ "classifier_proj_size": 256,
+ "codevector_dim": 768,
+ "contrastive_logits_temperature": 0.1,
+ "conv_bias": true,
+ "conv_dim": [
+ 512,
+ 512,
+ 512,
+ 512,
+ 512,
+ 512,
+ 512
+ ],
+ "conv_kernel": [
+ 10,
+ 3,
+ 3,
+ 3,
+ 3,
+ 2,
+ 2
+ ],
+ "conv_stride": [
+ 5,
+ 2,
+ 2,
+ 2,
+ 2,
+ 2,
+ 2
+ ],
+ "ctc_loss_reduction": "mean",
+ "ctc_zero_infinity": false,
+ "diversity_loss_weight": 0.1,
+ "do_stable_layer_norm": true,
+ "eos_token_id": 2,
+ "feat_extract_activation": "gelu",
+ "feat_extract_dropout": 0.0,
+ "feat_extract_norm": "layer",
+ "feat_proj_dropout": 0.0,
+ "feat_quantizer_dropout": 0.0,
+ "final_dropout": 0.0,
+ "hidden_act": "gelu",
+ "hidden_dropout": 0.0,
+ "hidden_size": 1024,
+ "initializer_range": 0.02,
+ "intermediate_size": 4096,
+ "layer_norm_eps": 1e-05,
+ "layerdrop": 0.0,
+ "mask_feature_length": 64,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.25,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.75,
+ "model_type": "wav2vec2",
+ "num_adapter_layers": 3,
+ "num_attention_heads": 16,
+ "num_codevector_groups": 2,
+ "num_codevectors_per_group": 320,
+ "num_conv_pos_embedding_groups": 16,
+ "num_conv_pos_embeddings": 128,
+ "num_feat_extract_layers": 7,
+ "num_hidden_layers": 24,
+ "num_negatives": 100,
+ "output_hidden_size": 1024,
+ "pad_token_id": 4649,
+ "proj_codevector_dim": 768,
+ "tdnn_dilation": [
+ 1,
+ 2,
+ 3,
+ 1,
+ 1
+ ],
+ "tdnn_dim": [
+ 512,
+ 512,
+ 512,
+ 512,
+ 1500
+ ],
+ "tdnn_kernel": [
+ 5,
+ 3,
+ 3,
+ 1,
+ 1
+ ],
+ "torch_dtype": "float32",
+ "transformers_version": "4.17.0.dev0",
+ "use_weighted_layer_sum": false,
+ "vocab_size": 4652,
+ "xvector_output_dim": 512
+ }
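The config above describes the fine-tuned `Wav2Vec2ForCTC` model (24 hidden layers, hidden size 1024, a 4652-token character vocabulary, CTC pad id 4649). A minimal sketch for loading this checkpoint with the `transformers` library, assuming a local clone of the repository with the LFS-tracked weights pulled:

```python
# Minimal sketch (assumes transformers is installed and the LFS weights are present locally).
from transformers import Wav2Vec2Config, Wav2Vec2ForCTC

config = Wav2Vec2Config.from_pretrained("checkpoint-33500")  # reads the config.json above
model = Wav2Vec2ForCTC.from_pretrained("checkpoint-33500")   # reads the pytorch_model.bin added below
print(config.vocab_size, config.num_hidden_layers)  # 4652, 24 per the config above
```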
checkpoint-33500/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8ce13f36ca37b829381fbe22cd19b7908cd75e6bf008496ff7ea20fd75e52dfa
+ size 2528205329
checkpoint-33500/preprocessor_config.json ADDED
@@ -0,0 +1,9 @@
+ {
+ "do_normalize": true,
+ "feature_extractor_type": "Wav2Vec2FeatureExtractor",
+ "feature_size": 1,
+ "padding_side": "right",
+ "padding_value": 0,
+ "return_attention_mask": true,
+ "sampling_rate": 16000
+ }
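The preprocessor config above defines a `Wav2Vec2FeatureExtractor` that normalizes raw 16 kHz waveforms and returns an attention mask. A minimal usage sketch, assuming `transformers` and `numpy` are installed; the silent one-second array is a placeholder input, not real data:

```python
# Minimal sketch: apply the feature extractor defined by preprocessor_config.json above.
import numpy as np
from transformers import Wav2Vec2FeatureExtractor

extractor = Wav2Vec2FeatureExtractor.from_pretrained("checkpoint-33500")
audio = np.zeros(16000, dtype=np.float32)  # placeholder: one second of 16 kHz silence
inputs = extractor(audio, sampling_rate=16000, return_tensors="np")
print(inputs["input_values"].shape)  # (1, 16000): batch of normalized raw waveform
```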
checkpoint-33500/pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1273ff6d82e5f2311dc451329a36834857a6856481f96163b8c2c89d9f034192
+ size 1280996913
checkpoint-33500/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9469728500890cfe2fd689cbd21300ccf0d61a377c7ffe451a5f96fae3b9eb4a
+ size 14567
checkpoint-33500/scaler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2550a9ec6075dcae9af801d3288748d3d8d14838bad09563d150df43b06888bd
+ size 559
checkpoint-33500/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8feb8ac0ec9372196d566259c3023811a0d74e4dca394e20b3361260fe3e90ec
+ size 623
checkpoint-33500/trainer_state.json ADDED
@@ -0,0 +1,2629 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "best_metric": null,
3
+ "best_model_checkpoint": null,
4
+ "epoch": 49.4828349944629,
5
+ "global_step": 33500,
6
+ "is_hyper_param_search": false,
7
+ "is_local_process_zero": true,
8
+ "is_world_process_zero": true,
9
+ "log_history": [
10
+ {
11
+ "epoch": 0.15,
12
+ "learning_rate": 3.6375e-06,
13
+ "loss": 124.9665,
14
+ "step": 100
15
+ },
16
+ {
17
+ "epoch": 0.3,
18
+ "learning_rate": 7.3875e-06,
19
+ "loss": 92.673,
20
+ "step": 200
21
+ },
22
+ {
23
+ "epoch": 0.44,
24
+ "learning_rate": 1.1099999999999999e-05,
25
+ "loss": 74.8932,
26
+ "step": 300
27
+ },
28
+ {
29
+ "epoch": 0.59,
30
+ "learning_rate": 1.485e-05,
31
+ "loss": 68.0432,
32
+ "step": 400
33
+ },
34
+ {
35
+ "epoch": 0.74,
36
+ "learning_rate": 1.8599999999999998e-05,
37
+ "loss": 60.2112,
38
+ "step": 500
39
+ },
40
+ {
41
+ "epoch": 0.74,
42
+ "eval_loss": 64.81886291503906,
43
+ "eval_runtime": 129.9516,
44
+ "eval_samples_per_second": 15.552,
45
+ "eval_steps_per_second": 1.947,
46
+ "eval_wer": 1.0,
47
+ "step": 500
48
+ },
49
+ {
50
+ "epoch": 0.89,
51
+ "learning_rate": 2.2349999999999998e-05,
52
+ "loss": 51.3096,
53
+ "step": 600
54
+ },
55
+ {
56
+ "epoch": 1.03,
57
+ "learning_rate": 2.6099999999999997e-05,
58
+ "loss": 39.1106,
59
+ "step": 700
60
+ },
61
+ {
62
+ "epoch": 1.18,
63
+ "learning_rate": 2.985e-05,
64
+ "loss": 26.6843,
65
+ "step": 800
66
+ },
67
+ {
68
+ "epoch": 1.33,
69
+ "learning_rate": 3.36e-05,
70
+ "loss": 14.7864,
71
+ "step": 900
72
+ },
73
+ {
74
+ "epoch": 1.48,
75
+ "learning_rate": 3.735e-05,
76
+ "loss": 8.1128,
77
+ "step": 1000
78
+ },
79
+ {
80
+ "epoch": 1.48,
81
+ "eval_loss": 6.899676322937012,
82
+ "eval_runtime": 115.5788,
83
+ "eval_samples_per_second": 17.486,
84
+ "eval_steps_per_second": 2.189,
85
+ "eval_wer": 1.0,
86
+ "step": 1000
87
+ },
88
+ {
89
+ "epoch": 1.62,
90
+ "learning_rate": 4.11e-05,
91
+ "loss": 6.6068,
92
+ "step": 1100
93
+ },
94
+ {
95
+ "epoch": 1.77,
96
+ "learning_rate": 4.484999999999999e-05,
97
+ "loss": 6.23,
98
+ "step": 1200
99
+ },
100
+ {
101
+ "epoch": 1.92,
102
+ "learning_rate": 4.8599999999999995e-05,
103
+ "loss": 6.0972,
104
+ "step": 1300
105
+ },
106
+ {
107
+ "epoch": 2.07,
108
+ "learning_rate": 5.234999999999999e-05,
109
+ "loss": 6.0595,
110
+ "step": 1400
111
+ },
112
+ {
113
+ "epoch": 2.22,
114
+ "learning_rate": 5.6099999999999995e-05,
115
+ "loss": 6.0492,
116
+ "step": 1500
117
+ },
118
+ {
119
+ "epoch": 2.22,
120
+ "eval_loss": 5.967654228210449,
121
+ "eval_runtime": 115.432,
122
+ "eval_samples_per_second": 17.508,
123
+ "eval_steps_per_second": 2.192,
124
+ "eval_wer": 1.949529935675408,
125
+ "step": 1500
126
+ },
127
+ {
128
+ "epoch": 2.36,
129
+ "learning_rate": 5.985e-05,
130
+ "loss": 6.0266,
131
+ "step": 1600
132
+ },
133
+ {
134
+ "epoch": 2.51,
135
+ "learning_rate": 6.359999999999999e-05,
136
+ "loss": 5.9902,
137
+ "step": 1700
138
+ },
139
+ {
140
+ "epoch": 2.66,
141
+ "learning_rate": 6.735e-05,
142
+ "loss": 5.9762,
143
+ "step": 1800
144
+ },
145
+ {
146
+ "epoch": 2.81,
147
+ "learning_rate": 7.11e-05,
148
+ "loss": 5.9491,
149
+ "step": 1900
150
+ },
151
+ {
152
+ "epoch": 2.95,
153
+ "learning_rate": 7.484999999999999e-05,
154
+ "loss": 5.9326,
155
+ "step": 2000
156
+ },
157
+ {
158
+ "epoch": 2.95,
159
+ "eval_loss": 5.884542942047119,
160
+ "eval_runtime": 114.597,
161
+ "eval_samples_per_second": 17.636,
162
+ "eval_steps_per_second": 2.208,
163
+ "eval_wer": 1.409203364670955,
164
+ "step": 2000
165
+ },
166
+ {
167
+ "epoch": 3.1,
168
+ "learning_rate": 7.477394034536891e-05,
169
+ "loss": 5.9356,
170
+ "step": 2100
171
+ },
172
+ {
173
+ "epoch": 3.25,
174
+ "learning_rate": 7.453846153846153e-05,
175
+ "loss": 5.8889,
176
+ "step": 2200
177
+ },
178
+ {
179
+ "epoch": 3.4,
180
+ "learning_rate": 7.430298273155415e-05,
181
+ "loss": 5.899,
182
+ "step": 2300
183
+ },
184
+ {
185
+ "epoch": 3.54,
186
+ "learning_rate": 7.406750392464678e-05,
187
+ "loss": 5.8824,
188
+ "step": 2400
189
+ },
190
+ {
191
+ "epoch": 3.69,
192
+ "learning_rate": 7.38320251177394e-05,
193
+ "loss": 5.8763,
194
+ "step": 2500
195
+ },
196
+ {
197
+ "epoch": 3.69,
198
+ "eval_loss": 5.846009731292725,
199
+ "eval_runtime": 117.5393,
200
+ "eval_samples_per_second": 17.194,
201
+ "eval_steps_per_second": 2.152,
202
+ "eval_wer": 1.6125680356259278,
203
+ "step": 2500
204
+ },
205
+ {
206
+ "epoch": 3.84,
207
+ "learning_rate": 7.359654631083201e-05,
208
+ "loss": 5.875,
209
+ "step": 2600
210
+ },
211
+ {
212
+ "epoch": 3.99,
213
+ "learning_rate": 7.336106750392464e-05,
214
+ "loss": 5.8671,
215
+ "step": 2700
216
+ },
217
+ {
218
+ "epoch": 4.14,
219
+ "learning_rate": 7.312558869701726e-05,
220
+ "loss": 5.8591,
221
+ "step": 2800
222
+ },
223
+ {
224
+ "epoch": 4.28,
225
+ "learning_rate": 7.289010989010989e-05,
226
+ "loss": 5.8226,
227
+ "step": 2900
228
+ },
229
+ {
230
+ "epoch": 4.43,
231
+ "learning_rate": 7.265463108320251e-05,
232
+ "loss": 5.7888,
233
+ "step": 3000
234
+ },
235
+ {
236
+ "epoch": 4.43,
237
+ "eval_loss": 5.75445032119751,
238
+ "eval_runtime": 114.1832,
239
+ "eval_samples_per_second": 17.7,
240
+ "eval_steps_per_second": 2.216,
241
+ "eval_wer": 2.2033646709549726,
242
+ "step": 3000
243
+ },
244
+ {
245
+ "epoch": 4.58,
246
+ "learning_rate": 7.241915227629513e-05,
247
+ "loss": 5.8041,
248
+ "step": 3100
249
+ },
250
+ {
251
+ "epoch": 4.73,
252
+ "learning_rate": 7.218367346938774e-05,
253
+ "loss": 5.8013,
254
+ "step": 3200
255
+ },
256
+ {
257
+ "epoch": 4.87,
258
+ "learning_rate": 7.194819466248037e-05,
259
+ "loss": 5.7947,
260
+ "step": 3300
261
+ },
262
+ {
263
+ "epoch": 5.02,
264
+ "learning_rate": 7.171271585557299e-05,
265
+ "loss": 5.7802,
266
+ "step": 3400
267
+ },
268
+ {
269
+ "epoch": 5.17,
270
+ "learning_rate": 7.147723704866562e-05,
271
+ "loss": 5.735,
272
+ "step": 3500
273
+ },
274
+ {
275
+ "epoch": 5.17,
276
+ "eval_loss": 5.677657604217529,
277
+ "eval_runtime": 115.6516,
278
+ "eval_samples_per_second": 17.475,
279
+ "eval_steps_per_second": 2.188,
280
+ "eval_wer": 2.334982681840673,
281
+ "step": 3500
282
+ },
283
+ {
284
+ "epoch": 5.32,
285
+ "learning_rate": 7.124175824175823e-05,
286
+ "loss": 5.7198,
287
+ "step": 3600
288
+ },
289
+ {
290
+ "epoch": 5.47,
291
+ "learning_rate": 7.100627943485086e-05,
292
+ "loss": 5.7092,
293
+ "step": 3700
294
+ },
295
+ {
296
+ "epoch": 5.61,
297
+ "learning_rate": 7.077080062794347e-05,
298
+ "loss": 5.6613,
299
+ "step": 3800
300
+ },
301
+ {
302
+ "epoch": 5.76,
303
+ "learning_rate": 7.05353218210361e-05,
304
+ "loss": 5.6579,
305
+ "step": 3900
306
+ },
307
+ {
308
+ "epoch": 5.91,
309
+ "learning_rate": 7.029984301412873e-05,
310
+ "loss": 5.6861,
311
+ "step": 4000
312
+ },
313
+ {
314
+ "epoch": 5.91,
315
+ "eval_loss": 5.517865180969238,
316
+ "eval_runtime": 115.3653,
317
+ "eval_samples_per_second": 17.518,
318
+ "eval_steps_per_second": 2.193,
319
+ "eval_wer": 2.223156853043048,
320
+ "step": 4000
321
+ },
322
+ {
323
+ "epoch": 6.06,
324
+ "learning_rate": 7.006436420722135e-05,
325
+ "loss": 5.6024,
326
+ "step": 4100
327
+ },
328
+ {
329
+ "epoch": 6.2,
330
+ "learning_rate": 6.982888540031396e-05,
331
+ "loss": 5.5497,
332
+ "step": 4200
333
+ },
334
+ {
335
+ "epoch": 6.35,
336
+ "learning_rate": 6.959340659340659e-05,
337
+ "loss": 5.5257,
338
+ "step": 4300
339
+ },
340
+ {
341
+ "epoch": 6.5,
342
+ "learning_rate": 6.93579277864992e-05,
343
+ "loss": 5.4534,
344
+ "step": 4400
345
+ },
346
+ {
347
+ "epoch": 6.65,
348
+ "learning_rate": 6.912244897959182e-05,
349
+ "loss": 5.381,
350
+ "step": 4500
351
+ },
352
+ {
353
+ "epoch": 6.65,
354
+ "eval_loss": 5.142032146453857,
355
+ "eval_runtime": 117.6237,
356
+ "eval_samples_per_second": 17.182,
357
+ "eval_steps_per_second": 2.151,
358
+ "eval_wer": 2.18159327065809,
359
+ "step": 4500
360
+ },
361
+ {
362
+ "epoch": 6.79,
363
+ "learning_rate": 6.888697017268445e-05,
364
+ "loss": 5.3409,
365
+ "step": 4600
366
+ },
367
+ {
368
+ "epoch": 6.94,
369
+ "learning_rate": 6.865149136577708e-05,
370
+ "loss": 5.1283,
371
+ "step": 4700
372
+ },
373
+ {
374
+ "epoch": 7.09,
375
+ "learning_rate": 6.841601255886969e-05,
376
+ "loss": 4.8788,
377
+ "step": 4800
378
+ },
379
+ {
380
+ "epoch": 7.24,
381
+ "learning_rate": 6.818053375196232e-05,
382
+ "loss": 4.7235,
383
+ "step": 4900
384
+ },
385
+ {
386
+ "epoch": 7.39,
387
+ "learning_rate": 6.794505494505494e-05,
388
+ "loss": 4.625,
389
+ "step": 5000
390
+ },
391
+ {
392
+ "epoch": 7.39,
393
+ "eval_loss": 3.9019837379455566,
394
+ "eval_runtime": 116.0971,
395
+ "eval_samples_per_second": 17.408,
396
+ "eval_steps_per_second": 2.179,
397
+ "eval_wer": 2.0722414646214746,
398
+ "step": 5000
399
+ },
400
+ {
401
+ "epoch": 7.53,
402
+ "learning_rate": 6.770957613814756e-05,
403
+ "loss": 4.5404,
404
+ "step": 5100
405
+ },
406
+ {
407
+ "epoch": 7.68,
408
+ "learning_rate": 6.747409733124018e-05,
409
+ "loss": 4.4307,
410
+ "step": 5200
411
+ },
412
+ {
413
+ "epoch": 7.83,
414
+ "learning_rate": 6.723861852433281e-05,
415
+ "loss": 4.3794,
416
+ "step": 5300
417
+ },
418
+ {
419
+ "epoch": 7.98,
420
+ "learning_rate": 6.700313971742542e-05,
421
+ "loss": 4.2786,
422
+ "step": 5400
423
+ },
424
+ {
425
+ "epoch": 8.12,
426
+ "learning_rate": 6.676766091051805e-05,
427
+ "loss": 4.214,
428
+ "step": 5500
429
+ },
430
+ {
431
+ "epoch": 8.12,
432
+ "eval_loss": 3.339416027069092,
433
+ "eval_runtime": 116.9868,
434
+ "eval_samples_per_second": 17.275,
435
+ "eval_steps_per_second": 2.163,
436
+ "eval_wer": 2.1429985155863434,
437
+ "step": 5500
438
+ },
439
+ {
440
+ "epoch": 8.27,
441
+ "learning_rate": 6.653218210361068e-05,
442
+ "loss": 4.1206,
443
+ "step": 5600
444
+ },
445
+ {
446
+ "epoch": 8.42,
447
+ "learning_rate": 6.62967032967033e-05,
448
+ "loss": 4.081,
449
+ "step": 5700
450
+ },
451
+ {
452
+ "epoch": 8.57,
453
+ "learning_rate": 6.606122448979591e-05,
454
+ "loss": 4.0059,
455
+ "step": 5800
456
+ },
457
+ {
458
+ "epoch": 8.71,
459
+ "learning_rate": 6.582574568288854e-05,
460
+ "loss": 3.9251,
461
+ "step": 5900
462
+ },
463
+ {
464
+ "epoch": 8.86,
465
+ "learning_rate": 6.559262166405023e-05,
466
+ "loss": 3.8992,
467
+ "step": 6000
468
+ },
469
+ {
470
+ "epoch": 8.86,
471
+ "eval_loss": 2.9084665775299072,
472
+ "eval_runtime": 119.0907,
473
+ "eval_samples_per_second": 16.97,
474
+ "eval_steps_per_second": 2.124,
475
+ "eval_wer": 2.153389411182583,
476
+ "step": 6000
477
+ },
478
+ {
479
+ "epoch": 9.01,
480
+ "learning_rate": 6.535714285714285e-05,
481
+ "loss": 3.8494,
482
+ "step": 6100
483
+ },
484
+ {
485
+ "epoch": 9.16,
486
+ "learning_rate": 6.512166405023547e-05,
487
+ "loss": 3.7923,
488
+ "step": 6200
489
+ },
490
+ {
491
+ "epoch": 9.31,
492
+ "learning_rate": 6.48861852433281e-05,
493
+ "loss": 3.7416,
494
+ "step": 6300
495
+ },
496
+ {
497
+ "epoch": 9.45,
498
+ "learning_rate": 6.465070643642071e-05,
499
+ "loss": 3.7095,
500
+ "step": 6400
501
+ },
502
+ {
503
+ "epoch": 9.6,
504
+ "learning_rate": 6.441522762951334e-05,
505
+ "loss": 3.6481,
506
+ "step": 6500
507
+ },
508
+ {
509
+ "epoch": 9.6,
510
+ "eval_loss": 2.620758295059204,
511
+ "eval_runtime": 115.2407,
512
+ "eval_samples_per_second": 17.537,
513
+ "eval_steps_per_second": 2.195,
514
+ "eval_wer": 2.3537852548243445,
515
+ "step": 6500
516
+ },
517
+ {
518
+ "epoch": 9.75,
519
+ "learning_rate": 6.417974882260596e-05,
520
+ "loss": 3.6196,
521
+ "step": 6600
522
+ },
523
+ {
524
+ "epoch": 9.9,
525
+ "learning_rate": 6.394427001569859e-05,
526
+ "loss": 3.5941,
527
+ "step": 6700
528
+ },
529
+ {
530
+ "epoch": 10.04,
531
+ "learning_rate": 6.37087912087912e-05,
532
+ "loss": 3.5608,
533
+ "step": 6800
534
+ },
535
+ {
536
+ "epoch": 10.19,
537
+ "learning_rate": 6.347331240188383e-05,
538
+ "loss": 3.5296,
539
+ "step": 6900
540
+ },
541
+ {
542
+ "epoch": 10.34,
543
+ "learning_rate": 6.324018838304552e-05,
544
+ "loss": 3.4658,
545
+ "step": 7000
546
+ },
547
+ {
548
+ "epoch": 10.34,
549
+ "eval_loss": 2.3172152042388916,
550
+ "eval_runtime": 114.5436,
551
+ "eval_samples_per_second": 17.644,
552
+ "eval_steps_per_second": 2.209,
553
+ "eval_wer": 2.227115289460663,
554
+ "step": 7000
555
+ },
556
+ {
557
+ "epoch": 10.49,
558
+ "learning_rate": 6.300470957613814e-05,
559
+ "loss": 3.3977,
560
+ "step": 7100
561
+ },
562
+ {
563
+ "epoch": 10.63,
564
+ "learning_rate": 6.276923076923076e-05,
565
+ "loss": 3.3987,
566
+ "step": 7200
567
+ },
568
+ {
569
+ "epoch": 10.78,
570
+ "learning_rate": 6.253375196232339e-05,
571
+ "loss": 3.3587,
572
+ "step": 7300
573
+ },
574
+ {
575
+ "epoch": 10.93,
576
+ "learning_rate": 6.2298273155416e-05,
577
+ "loss": 3.2796,
578
+ "step": 7400
579
+ },
580
+ {
581
+ "epoch": 11.08,
582
+ "learning_rate": 6.206279434850863e-05,
583
+ "loss": 3.257,
584
+ "step": 7500
585
+ },
586
+ {
587
+ "epoch": 11.08,
588
+ "eval_loss": 2.0916049480438232,
589
+ "eval_runtime": 113.9408,
590
+ "eval_samples_per_second": 17.737,
591
+ "eval_steps_per_second": 2.22,
592
+ "eval_wer": 2.1350816427511132,
593
+ "step": 7500
594
+ },
595
+ {
596
+ "epoch": 11.23,
597
+ "learning_rate": 6.182731554160125e-05,
598
+ "loss": 3.2476,
599
+ "step": 7600
600
+ },
601
+ {
602
+ "epoch": 11.37,
603
+ "learning_rate": 6.159183673469388e-05,
604
+ "loss": 3.2463,
605
+ "step": 7700
606
+ },
607
+ {
608
+ "epoch": 11.52,
609
+ "learning_rate": 6.135635792778649e-05,
610
+ "loss": 3.2323,
611
+ "step": 7800
612
+ },
613
+ {
614
+ "epoch": 11.67,
615
+ "learning_rate": 6.112087912087912e-05,
616
+ "loss": 3.1674,
617
+ "step": 7900
618
+ },
619
+ {
620
+ "epoch": 11.82,
621
+ "learning_rate": 6.088540031397174e-05,
622
+ "loss": 3.1294,
623
+ "step": 8000
624
+ },
625
+ {
626
+ "epoch": 11.82,
627
+ "eval_loss": 1.895378828048706,
628
+ "eval_runtime": 115.1394,
629
+ "eval_samples_per_second": 17.553,
630
+ "eval_steps_per_second": 2.197,
631
+ "eval_wer": 2.2132607619990106,
632
+ "step": 8000
633
+ },
634
+ {
635
+ "epoch": 11.96,
636
+ "learning_rate": 6.0649921507064355e-05,
637
+ "loss": 3.1262,
638
+ "step": 8100
639
+ },
640
+ {
641
+ "epoch": 12.11,
642
+ "learning_rate": 6.041444270015698e-05,
643
+ "loss": 3.0377,
644
+ "step": 8200
645
+ },
646
+ {
647
+ "epoch": 12.26,
648
+ "learning_rate": 6.01789638932496e-05,
649
+ "loss": 3.0306,
650
+ "step": 8300
651
+ },
652
+ {
653
+ "epoch": 12.41,
654
+ "learning_rate": 5.994348508634223e-05,
655
+ "loss": 3.0425,
656
+ "step": 8400
657
+ },
658
+ {
659
+ "epoch": 12.56,
660
+ "learning_rate": 5.9710361067503915e-05,
661
+ "loss": 3.0266,
662
+ "step": 8500
663
+ },
664
+ {
665
+ "epoch": 12.56,
666
+ "eval_loss": 1.76727294921875,
667
+ "eval_runtime": 114.3494,
668
+ "eval_samples_per_second": 17.674,
669
+ "eval_steps_per_second": 2.213,
670
+ "eval_wer": 2.0895596239485403,
671
+ "step": 8500
672
+ },
673
+ {
674
+ "epoch": 12.7,
675
+ "learning_rate": 5.9474882260596537e-05,
676
+ "loss": 3.0398,
677
+ "step": 8600
678
+ },
679
+ {
680
+ "epoch": 12.85,
681
+ "learning_rate": 5.9239403453689165e-05,
682
+ "loss": 2.9985,
683
+ "step": 8700
684
+ },
685
+ {
686
+ "epoch": 13.0,
687
+ "learning_rate": 5.900392464678179e-05,
688
+ "loss": 2.9969,
689
+ "step": 8800
690
+ },
691
+ {
692
+ "epoch": 13.15,
693
+ "learning_rate": 5.876844583987441e-05,
694
+ "loss": 2.9648,
695
+ "step": 8900
696
+ },
697
+ {
698
+ "epoch": 13.29,
699
+ "learning_rate": 5.8532967032967024e-05,
700
+ "loss": 2.9451,
701
+ "step": 9000
702
+ },
703
+ {
704
+ "epoch": 13.29,
705
+ "eval_loss": 1.665855884552002,
706
+ "eval_runtime": 116.4877,
707
+ "eval_samples_per_second": 17.349,
708
+ "eval_steps_per_second": 2.172,
709
+ "eval_wer": 2.1380504700643246,
710
+ "step": 9000
711
+ },
712
+ {
713
+ "epoch": 13.44,
714
+ "learning_rate": 5.8297488226059645e-05,
715
+ "loss": 2.9573,
716
+ "step": 9100
717
+ },
718
+ {
719
+ "epoch": 13.59,
720
+ "learning_rate": 5.8062009419152274e-05,
721
+ "loss": 2.8819,
722
+ "step": 9200
723
+ },
724
+ {
725
+ "epoch": 13.74,
726
+ "learning_rate": 5.7826530612244896e-05,
727
+ "loss": 2.8901,
728
+ "step": 9300
729
+ },
730
+ {
731
+ "epoch": 13.88,
732
+ "learning_rate": 5.759105180533752e-05,
733
+ "loss": 2.8492,
734
+ "step": 9400
735
+ },
736
+ {
737
+ "epoch": 14.03,
738
+ "learning_rate": 5.735557299843013e-05,
739
+ "loss": 2.8802,
740
+ "step": 9500
741
+ },
742
+ {
743
+ "epoch": 14.03,
744
+ "eval_loss": 1.5637215375900269,
745
+ "eval_runtime": 115.1622,
746
+ "eval_samples_per_second": 17.549,
747
+ "eval_steps_per_second": 2.197,
748
+ "eval_wer": 2.1969322117763483,
749
+ "step": 9500
750
+ },
751
+ {
752
+ "epoch": 14.18,
753
+ "learning_rate": 5.7120094191522754e-05,
754
+ "loss": 2.8346,
755
+ "step": 9600
756
+ },
757
+ {
758
+ "epoch": 14.33,
759
+ "learning_rate": 5.6884615384615376e-05,
760
+ "loss": 2.8355,
761
+ "step": 9700
762
+ },
763
+ {
764
+ "epoch": 14.48,
765
+ "learning_rate": 5.6649136577708005e-05,
766
+ "loss": 2.8124,
767
+ "step": 9800
768
+ },
769
+ {
770
+ "epoch": 14.62,
771
+ "learning_rate": 5.6413657770800626e-05,
772
+ "loss": 2.7879,
773
+ "step": 9900
774
+ },
775
+ {
776
+ "epoch": 14.77,
777
+ "learning_rate": 5.617817896389324e-05,
778
+ "loss": 2.78,
779
+ "step": 10000
780
+ },
781
+ {
782
+ "epoch": 14.77,
783
+ "eval_loss": 1.4921427965164185,
784
+ "eval_runtime": 115.1,
785
+ "eval_samples_per_second": 17.559,
786
+ "eval_steps_per_second": 2.198,
787
+ "eval_wer": 2.2335477486392876,
788
+ "step": 10000
789
+ },
790
+ {
791
+ "epoch": 14.92,
792
+ "learning_rate": 5.594270015698586e-05,
793
+ "loss": 2.775,
794
+ "step": 10100
795
+ },
796
+ {
797
+ "epoch": 15.07,
798
+ "learning_rate": 5.5707221350078485e-05,
799
+ "loss": 2.7478,
800
+ "step": 10200
801
+ },
802
+ {
803
+ "epoch": 15.21,
804
+ "learning_rate": 5.5471742543171114e-05,
805
+ "loss": 2.7224,
806
+ "step": 10300
807
+ },
808
+ {
809
+ "epoch": 15.36,
810
+ "learning_rate": 5.5236263736263735e-05,
811
+ "loss": 2.7506,
812
+ "step": 10400
813
+ },
814
+ {
815
+ "epoch": 15.51,
816
+ "learning_rate": 5.500078492935635e-05,
817
+ "loss": 2.7049,
818
+ "step": 10500
819
+ },
820
+ {
821
+ "epoch": 15.51,
822
+ "eval_loss": 1.413183569908142,
823
+ "eval_runtime": 114.2743,
824
+ "eval_samples_per_second": 17.686,
825
+ "eval_steps_per_second": 2.214,
826
+ "eval_wer": 2.221672439386442,
827
+ "step": 10500
828
+ },
829
+ {
830
+ "epoch": 15.66,
831
+ "learning_rate": 5.476766091051805e-05,
832
+ "loss": 2.7145,
833
+ "step": 10600
834
+ },
835
+ {
836
+ "epoch": 15.8,
837
+ "learning_rate": 5.453218210361067e-05,
838
+ "loss": 2.6892,
839
+ "step": 10700
840
+ },
841
+ {
842
+ "epoch": 15.95,
843
+ "learning_rate": 5.4296703296703295e-05,
844
+ "loss": 2.69,
845
+ "step": 10800
846
+ },
847
+ {
848
+ "epoch": 16.1,
849
+ "learning_rate": 5.406122448979591e-05,
850
+ "loss": 2.623,
851
+ "step": 10900
852
+ },
853
+ {
854
+ "epoch": 16.25,
855
+ "learning_rate": 5.382574568288853e-05,
856
+ "loss": 2.6768,
857
+ "step": 11000
858
+ },
859
+ {
860
+ "epoch": 16.25,
861
+ "eval_loss": 1.3666878938674927,
862
+ "eval_runtime": 119.4402,
863
+ "eval_samples_per_second": 16.921,
864
+ "eval_steps_per_second": 2.118,
865
+ "eval_wer": 2.223156853043048,
866
+ "step": 11000
867
+ },
868
+ {
869
+ "epoch": 16.4,
870
+ "learning_rate": 5.359262166405023e-05,
871
+ "loss": 2.628,
872
+ "step": 11100
873
+ },
874
+ {
875
+ "epoch": 16.54,
876
+ "learning_rate": 5.3357142857142854e-05,
877
+ "loss": 2.6163,
878
+ "step": 11200
879
+ },
880
+ {
881
+ "epoch": 16.69,
882
+ "learning_rate": 5.312166405023547e-05,
883
+ "loss": 2.6193,
884
+ "step": 11300
885
+ },
886
+ {
887
+ "epoch": 16.84,
888
+ "learning_rate": 5.28861852433281e-05,
889
+ "loss": 2.6531,
890
+ "step": 11400
891
+ },
892
+ {
893
+ "epoch": 16.99,
894
+ "learning_rate": 5.265070643642072e-05,
895
+ "loss": 2.6358,
896
+ "step": 11500
897
+ },
898
+ {
899
+ "epoch": 16.99,
900
+ "eval_loss": 1.311090111732483,
901
+ "eval_runtime": 116.2157,
902
+ "eval_samples_per_second": 17.39,
903
+ "eval_steps_per_second": 2.177,
904
+ "eval_wer": 2.128649183572489,
905
+ "step": 11500
906
+ },
907
+ {
908
+ "epoch": 17.13,
909
+ "learning_rate": 5.241522762951334e-05,
910
+ "loss": 2.5748,
911
+ "step": 11600
912
+ },
913
+ {
914
+ "epoch": 17.28,
915
+ "learning_rate": 5.217974882260596e-05,
916
+ "loss": 2.6287,
917
+ "step": 11700
918
+ },
919
+ {
920
+ "epoch": 17.43,
921
+ "learning_rate": 5.194427001569858e-05,
922
+ "loss": 2.5583,
923
+ "step": 11800
924
+ },
925
+ {
926
+ "epoch": 17.58,
927
+ "learning_rate": 5.17087912087912e-05,
928
+ "loss": 2.5547,
929
+ "step": 11900
930
+ },
931
+ {
932
+ "epoch": 17.72,
933
+ "learning_rate": 5.147331240188383e-05,
934
+ "loss": 2.5802,
935
+ "step": 12000
936
+ },
937
+ {
938
+ "epoch": 17.72,
939
+ "eval_loss": 1.2678567171096802,
940
+ "eval_runtime": 116.076,
941
+ "eval_samples_per_second": 17.411,
942
+ "eval_steps_per_second": 2.18,
943
+ "eval_wer": 2.1429985155863434,
944
+ "step": 12000
945
+ },
946
+ {
947
+ "epoch": 17.87,
948
+ "learning_rate": 5.123783359497645e-05,
949
+ "loss": 2.557,
950
+ "step": 12100
951
+ },
952
+ {
953
+ "epoch": 18.02,
954
+ "learning_rate": 5.100235478806907e-05,
955
+ "loss": 2.5771,
956
+ "step": 12200
957
+ },
958
+ {
959
+ "epoch": 18.17,
960
+ "learning_rate": 5.076687598116169e-05,
961
+ "loss": 2.5393,
962
+ "step": 12300
963
+ },
964
+ {
965
+ "epoch": 18.32,
966
+ "learning_rate": 5.053375196232339e-05,
967
+ "loss": 2.5031,
968
+ "step": 12400
969
+ },
970
+ {
971
+ "epoch": 18.46,
972
+ "learning_rate": 5.029827315541601e-05,
973
+ "loss": 2.5012,
974
+ "step": 12500
975
+ },
976
+ {
977
+ "epoch": 18.46,
978
+ "eval_loss": 1.2365446090698242,
979
+ "eval_runtime": 116.0118,
980
+ "eval_samples_per_second": 17.421,
981
+ "eval_steps_per_second": 2.181,
982
+ "eval_wer": 2.115289460663038,
983
+ "step": 12500
984
+ },
985
+ {
986
+ "epoch": 18.61,
987
+ "learning_rate": 5.006279434850863e-05,
988
+ "loss": 2.54,
989
+ "step": 12600
990
+ },
991
+ {
992
+ "epoch": 18.76,
993
+ "learning_rate": 4.9827315541601246e-05,
994
+ "loss": 2.5072,
995
+ "step": 12700
996
+ },
997
+ {
998
+ "epoch": 18.91,
999
+ "learning_rate": 4.9591836734693875e-05,
1000
+ "loss": 2.4951,
1001
+ "step": 12800
1002
+ },
1003
+ {
1004
+ "epoch": 19.05,
1005
+ "learning_rate": 4.9356357927786497e-05,
1006
+ "loss": 2.4789,
1007
+ "step": 12900
1008
+ },
1009
+ {
1010
+ "epoch": 19.2,
1011
+ "learning_rate": 4.912087912087912e-05,
1012
+ "loss": 2.458,
1013
+ "step": 13000
1014
+ },
1015
+ {
1016
+ "epoch": 19.2,
1017
+ "eval_loss": 1.2117862701416016,
1018
+ "eval_runtime": 116.2579,
1019
+ "eval_samples_per_second": 17.384,
1020
+ "eval_steps_per_second": 2.176,
1021
+ "eval_wer": 2.1573478476001977,
1022
+ "step": 13000
1023
+ },
1024
+ {
1025
+ "epoch": 19.35,
1026
+ "learning_rate": 4.888540031397174e-05,
1027
+ "loss": 2.4616,
1028
+ "step": 13100
1029
+ },
1030
+ {
1031
+ "epoch": 19.5,
1032
+ "learning_rate": 4.8649921507064355e-05,
1033
+ "loss": 2.4739,
1034
+ "step": 13200
1035
+ },
1036
+ {
1037
+ "epoch": 19.65,
1038
+ "learning_rate": 4.8414442700156984e-05,
1039
+ "loss": 2.4867,
1040
+ "step": 13300
1041
+ },
1042
+ {
1043
+ "epoch": 19.79,
1044
+ "learning_rate": 4.8178963893249605e-05,
1045
+ "loss": 2.4568,
1046
+ "step": 13400
1047
+ },
1048
+ {
1049
+ "epoch": 19.94,
1050
+ "learning_rate": 4.794348508634223e-05,
1051
+ "loss": 2.4433,
1052
+ "step": 13500
1053
+ },
1054
+ {
1055
+ "epoch": 19.94,
1056
+ "eval_loss": 1.1991767883300781,
1057
+ "eval_runtime": 114.5641,
1058
+ "eval_samples_per_second": 17.641,
1059
+ "eval_steps_per_second": 2.208,
1060
+ "eval_wer": 2.1335972290945078,
1061
+ "step": 13500
1062
+ },
1063
+ {
1064
+ "epoch": 20.09,
1065
+ "learning_rate": 4.770800627943485e-05,
1066
+ "loss": 2.4532,
1067
+ "step": 13600
1068
+ },
1069
+ {
1070
+ "epoch": 20.24,
1071
+ "learning_rate": 4.7472527472527464e-05,
1072
+ "loss": 2.3913,
1073
+ "step": 13700
1074
+ },
1075
+ {
1076
+ "epoch": 20.38,
1077
+ "learning_rate": 4.7237048665620086e-05,
1078
+ "loss": 2.421,
1079
+ "step": 13800
1080
+ },
1081
+ {
1082
+ "epoch": 20.53,
1083
+ "learning_rate": 4.7001569858712714e-05,
1084
+ "loss": 2.4526,
1085
+ "step": 13900
1086
+ },
1087
+ {
1088
+ "epoch": 20.68,
1089
+ "learning_rate": 4.6766091051805336e-05,
1090
+ "loss": 2.438,
1091
+ "step": 14000
1092
+ },
1093
+ {
1094
+ "epoch": 20.68,
1095
+ "eval_loss": 1.180332064628601,
1096
+ "eval_runtime": 116.5012,
1097
+ "eval_samples_per_second": 17.347,
1098
+ "eval_steps_per_second": 2.172,
1099
+ "eval_wer": 2.1509153884215735,
1100
+ "step": 14000
1101
+ },
1102
+ {
1103
+ "epoch": 20.83,
1104
+ "learning_rate": 4.653061224489796e-05,
1105
+ "loss": 2.4034,
1106
+ "step": 14100
1107
+ },
1108
+ {
1109
+ "epoch": 20.97,
1110
+ "learning_rate": 4.629513343799057e-05,
1111
+ "loss": 2.4306,
1112
+ "step": 14200
1113
+ },
1114
+ {
1115
+ "epoch": 21.12,
1116
+ "learning_rate": 4.6059654631083195e-05,
1117
+ "loss": 2.4145,
1118
+ "step": 14300
1119
+ },
1120
+ {
1121
+ "epoch": 21.27,
1122
+ "learning_rate": 4.582417582417582e-05,
1123
+ "loss": 2.4677,
1124
+ "step": 14400
1125
+ },
1126
+ {
1127
+ "epoch": 21.42,
1128
+ "learning_rate": 4.5588697017268445e-05,
1129
+ "loss": 2.418,
1130
+ "step": 14500
1131
+ },
1132
+ {
1133
+ "epoch": 21.42,
1134
+ "eval_loss": 1.1601430177688599,
1135
+ "eval_runtime": 114.5652,
1136
+ "eval_samples_per_second": 17.641,
1137
+ "eval_steps_per_second": 2.208,
1138
+ "eval_wer": 2.1232063334982683,
1139
+ "step": 14500
1140
+ },
1141
+ {
1142
+ "epoch": 21.57,
1143
+ "learning_rate": 4.535321821036107e-05,
1144
+ "loss": 2.3967,
1145
+ "step": 14600
1146
+ },
1147
+ {
1148
+ "epoch": 21.71,
1149
+ "learning_rate": 4.511773940345368e-05,
1150
+ "loss": 2.3939,
1151
+ "step": 14700
1152
+ },
1153
+ {
1154
+ "epoch": 21.86,
1155
+ "learning_rate": 4.4882260596546304e-05,
1156
+ "loss": 2.3925,
1157
+ "step": 14800
1158
+ },
1159
+ {
1160
+ "epoch": 22.01,
1161
+ "learning_rate": 4.4646781789638925e-05,
1162
+ "loss": 2.3596,
1163
+ "step": 14900
1164
+ },
1165
+ {
1166
+ "epoch": 22.16,
1167
+ "learning_rate": 4.4411302982731554e-05,
1168
+ "loss": 2.3322,
1169
+ "step": 15000
1170
+ },
1171
+ {
1172
+ "epoch": 22.16,
1173
+ "eval_loss": 1.1417704820632935,
1174
+ "eval_runtime": 116.2111,
1175
+ "eval_samples_per_second": 17.391,
1176
+ "eval_steps_per_second": 2.177,
1177
+ "eval_wer": 2.1929737753587335,
1178
+ "step": 15000
1179
+ },
1180
+ {
1181
+ "epoch": 22.3,
1182
+ "learning_rate": 4.4175824175824176e-05,
1183
+ "loss": 2.3821,
1184
+ "step": 15100
1185
+ },
1186
+ {
1187
+ "epoch": 22.45,
1188
+ "learning_rate": 4.394034536891679e-05,
1189
+ "loss": 2.3435,
1190
+ "step": 15200
1191
+ },
1192
+ {
1193
+ "epoch": 22.6,
1194
+ "learning_rate": 4.370486656200941e-05,
1195
+ "loss": 2.3542,
1196
+ "step": 15300
1197
+ },
1198
+ {
1199
+ "epoch": 22.75,
1200
+ "learning_rate": 4.3469387755102034e-05,
1201
+ "loss": 2.3469,
1202
+ "step": 15400
1203
+ },
1204
+ {
1205
+ "epoch": 22.89,
1206
+ "learning_rate": 4.323390894819466e-05,
1207
+ "loss": 2.3387,
1208
+ "step": 15500
1209
+ },
1210
+ {
1211
+ "epoch": 22.89,
1212
+ "eval_loss": 1.1172302961349487,
1213
+ "eval_runtime": 114.3169,
1214
+ "eval_samples_per_second": 17.679,
1215
+ "eval_steps_per_second": 2.213,
1216
+ "eval_wer": 2.2464126669965365,
1217
+ "step": 15500
1218
+ },
1219
+ {
1220
+ "epoch": 23.04,
1221
+ "learning_rate": 4.2998430141287285e-05,
1222
+ "loss": 2.3688,
1223
+ "step": 15600
1224
+ },
1225
+ {
1226
+ "epoch": 23.19,
1227
+ "learning_rate": 4.27629513343799e-05,
1228
+ "loss": 2.3344,
1229
+ "step": 15700
1230
+ },
1231
+ {
1232
+ "epoch": 23.34,
1233
+ "learning_rate": 4.252747252747252e-05,
1234
+ "loss": 2.3245,
1235
+ "step": 15800
1236
+ },
1237
+ {
1238
+ "epoch": 23.49,
1239
+ "learning_rate": 4.229199372056514e-05,
1240
+ "loss": 2.3523,
1241
+ "step": 15900
1242
+ },
1243
+ {
1244
+ "epoch": 23.63,
1245
+ "learning_rate": 4.205651491365777e-05,
1246
+ "loss": 2.3349,
1247
+ "step": 16000
1248
+ },
1249
+ {
1250
+ "epoch": 23.63,
1251
+ "eval_loss": 1.1144375801086426,
1252
+ "eval_runtime": 116.2412,
1253
+ "eval_samples_per_second": 17.386,
1254
+ "eval_steps_per_second": 2.177,
1255
+ "eval_wer": 2.185551707075705,
1256
+ "step": 16000
1257
+ },
1258
+ {
1259
+ "epoch": 23.78,
1260
+ "learning_rate": 4.1821036106750393e-05,
1261
+ "loss": 2.2847,
1262
+ "step": 16100
1263
+ },
1264
+ {
1265
+ "epoch": 23.93,
1266
+ "learning_rate": 4.158555729984301e-05,
1267
+ "loss": 2.3303,
1268
+ "step": 16200
1269
+ },
1270
+ {
1271
+ "epoch": 24.08,
1272
+ "learning_rate": 4.135007849293563e-05,
1273
+ "loss": 2.2994,
1274
+ "step": 16300
1275
+ },
1276
+ {
1277
+ "epoch": 24.22,
1278
+ "learning_rate": 4.111459968602825e-05,
1279
+ "loss": 2.2887,
1280
+ "step": 16400
1281
+ },
1282
+ {
1283
+ "epoch": 24.37,
1284
+ "learning_rate": 4.0879120879120874e-05,
1285
+ "loss": 2.291,
1286
+ "step": 16500
1287
+ },
1288
+ {
1289
+ "epoch": 24.37,
1290
+ "eval_loss": 1.1018128395080566,
1291
+ "eval_runtime": 114.9042,
1292
+ "eval_samples_per_second": 17.589,
1293
+ "eval_steps_per_second": 2.202,
1294
+ "eval_wer": 2.1929737753587335,
1295
+ "step": 16500
1296
+ },
1297
+ {
1298
+ "epoch": 24.52,
1299
+ "learning_rate": 4.06436420722135e-05,
1300
+ "loss": 2.2888,
1301
+ "step": 16600
1302
+ },
1303
+ {
1304
+ "epoch": 24.67,
1305
+ "learning_rate": 4.040816326530612e-05,
1306
+ "loss": 2.2724,
1307
+ "step": 16700
1308
+ },
1309
+ {
1310
+ "epoch": 24.82,
1311
+ "learning_rate": 4.017268445839874e-05,
1312
+ "loss": 2.2922,
1313
+ "step": 16800
1314
+ },
1315
+ {
1316
+ "epoch": 24.96,
1317
+ "learning_rate": 3.993720565149136e-05,
1318
+ "loss": 2.2934,
1319
+ "step": 16900
1320
+ },
1321
+ {
1322
+ "epoch": 25.11,
1323
+ "learning_rate": 3.970172684458398e-05,
1324
+ "loss": 2.2766,
1325
+ "step": 17000
1326
+ },
1327
+ {
1328
+ "epoch": 25.11,
1329
+ "eval_loss": 1.0882744789123535,
1330
+ "eval_runtime": 117.2941,
1331
+ "eval_samples_per_second": 17.23,
1332
+ "eval_steps_per_second": 2.157,
1333
+ "eval_wer": 2.1761504205838693,
1334
+ "step": 17000
1335
+ },
1336
+ {
1337
+ "epoch": 25.26,
1338
+ "learning_rate": 3.946624803767661e-05,
1339
+ "loss": 2.2656,
1340
+ "step": 17100
1341
+ },
1342
+ {
1343
+ "epoch": 25.41,
1344
+ "learning_rate": 3.9230769230769226e-05,
1345
+ "loss": 2.2929,
1346
+ "step": 17200
1347
+ },
1348
+ {
1349
+ "epoch": 25.55,
1350
+ "learning_rate": 3.899529042386185e-05,
1351
+ "loss": 2.2513,
1352
+ "step": 17300
1353
+ },
1354
+ {
1355
+ "epoch": 25.7,
1356
+ "learning_rate": 3.875981161695447e-05,
1357
+ "loss": 2.2603,
1358
+ "step": 17400
1359
+ },
1360
+ {
1361
+ "epoch": 25.85,
1362
+ "learning_rate": 3.852433281004709e-05,
1363
+ "loss": 2.2534,
1364
+ "step": 17500
1365
+ },
1366
+ {
1367
+ "epoch": 25.85,
1368
+ "eval_loss": 1.0743526220321655,
1369
+ "eval_runtime": 118.2043,
1370
+ "eval_samples_per_second": 17.098,
1371
+ "eval_steps_per_second": 2.14,
1372
+ "eval_wer": 2.1875309252845128,
1373
+ "step": 17500
1374
+ },
1375
+ {
1376
+ "epoch": 26.0,
1377
+ "learning_rate": 3.8288854003139713e-05,
1378
+ "loss": 2.2716,
1379
+ "step": 17600
1380
+ },
1381
+ {
1382
+ "epoch": 26.14,
1383
+ "learning_rate": 3.8053375196232335e-05,
1384
+ "loss": 2.2486,
1385
+ "step": 17700
1386
+ },
1387
+ {
1388
+ "epoch": 26.29,
1389
+ "learning_rate": 3.781789638932496e-05,
1390
+ "loss": 2.2068,
1391
+ "step": 17800
1392
+ },
1393
+ {
1394
+ "epoch": 26.44,
1395
+ "learning_rate": 3.758241758241758e-05,
1396
+ "loss": 2.2431,
1397
+ "step": 17900
1398
+ },
1399
+ {
1400
+ "epoch": 26.59,
1401
+ "learning_rate": 3.73469387755102e-05,
1402
+ "loss": 2.2393,
1403
+ "step": 18000
1404
+ },
1405
+ {
1406
+ "epoch": 26.59,
1407
+ "eval_loss": 1.0561192035675049,
1408
+ "eval_runtime": 116.8996,
1409
+ "eval_samples_per_second": 17.288,
1410
+ "eval_steps_per_second": 2.164,
1411
+ "eval_wer": 2.1845620979713014,
1412
+ "step": 18000
1413
+ },
1414
+ {
1415
+ "epoch": 26.74,
1416
+ "learning_rate": 3.711145996860282e-05,
1417
+ "loss": 2.1944,
1418
+ "step": 18100
1419
+ },
1420
+ {
1421
+ "epoch": 26.88,
1422
+ "learning_rate": 3.6875981161695444e-05,
1423
+ "loss": 2.2359,
1424
+ "step": 18200
1425
+ },
1426
+ {
1427
+ "epoch": 27.03,
1428
+ "learning_rate": 3.664285714285714e-05,
1429
+ "loss": 2.2097,
1430
+ "step": 18300
1431
+ },
1432
+ {
1433
+ "epoch": 27.18,
1434
+ "learning_rate": 3.640737833594976e-05,
1435
+ "loss": 2.1431,
1436
+ "step": 18400
1437
+ },
1438
+ {
1439
+ "epoch": 27.33,
1440
+ "learning_rate": 3.617189952904238e-05,
1441
+ "loss": 2.2085,
1442
+ "step": 18500
1443
+ },
1444
+ {
1445
+ "epoch": 27.33,
1446
+ "eval_loss": 1.0465816259384155,
1447
+ "eval_runtime": 115.87,
1448
+ "eval_samples_per_second": 17.442,
1449
+ "eval_steps_per_second": 2.183,
1450
+ "eval_wer": 2.1444829292429493,
1451
+ "step": 18500
1452
+ },
1453
+ {
1454
+ "epoch": 27.47,
1455
+ "learning_rate": 3.5936420722135003e-05,
1456
+ "loss": 2.2204,
1457
+ "step": 18600
1458
+ },
1459
+ {
1460
+ "epoch": 27.62,
1461
+ "learning_rate": 3.5700941915227625e-05,
1462
+ "loss": 2.242,
1463
+ "step": 18700
1464
+ },
1465
+ {
1466
+ "epoch": 27.77,
1467
+ "learning_rate": 3.546546310832025e-05,
1468
+ "loss": 2.1699,
1469
+ "step": 18800
1470
+ },
1471
+ {
1472
+ "epoch": 27.92,
1473
+ "learning_rate": 3.522998430141287e-05,
1474
+ "loss": 2.2152,
1475
+ "step": 18900
1476
+ },
1477
+ {
1478
+ "epoch": 28.06,
1479
+ "learning_rate": 3.499450549450549e-05,
1480
+ "loss": 2.1966,
1481
+ "step": 19000
1482
+ },
1483
+ {
1484
+ "epoch": 28.06,
1485
+ "eval_loss": 1.0382250547409058,
1486
+ "eval_runtime": 116.4655,
1487
+ "eval_samples_per_second": 17.353,
1488
+ "eval_steps_per_second": 2.172,
1489
+ "eval_wer": 2.1088570014844135,
1490
+ "step": 19000
1491
+ },
1492
+ {
1493
+ "epoch": 28.21,
1494
+ "learning_rate": 3.475902668759811e-05,
1495
+ "loss": 2.169,
1496
+ "step": 19100
1497
+ },
1498
+ {
1499
+ "epoch": 28.36,
1500
+ "learning_rate": 3.4523547880690734e-05,
1501
+ "loss": 2.1981,
1502
+ "step": 19200
1503
+ },
1504
+ {
1505
+ "epoch": 28.51,
1506
+ "learning_rate": 3.4288069073783356e-05,
1507
+ "loss": 2.1692,
1508
+ "step": 19300
1509
+ },
1510
+ {
1511
+ "epoch": 28.66,
1512
+ "learning_rate": 3.405259026687598e-05,
1513
+ "loss": 2.1931,
1514
+ "step": 19400
1515
+ },
1516
+ {
1517
+ "epoch": 28.8,
1518
+ "learning_rate": 3.38171114599686e-05,
1519
+ "loss": 2.1794,
1520
+ "step": 19500
1521
+ },
1522
+ {
1523
+ "epoch": 28.8,
1524
+ "eval_loss": 1.0263785123825073,
1525
+ "eval_runtime": 114.5988,
1526
+ "eval_samples_per_second": 17.635,
1527
+ "eval_steps_per_second": 2.208,
1528
+ "eval_wer": 1.9861454725383474,
1529
+ "step": 19500
1530
+ },
1531
+ {
1532
+ "epoch": 28.95,
1533
+ "learning_rate": 3.358163265306122e-05,
1534
+ "loss": 2.1638,
1535
+ "step": 19600
1536
+ },
1537
+ {
1538
+ "epoch": 29.1,
1539
+ "learning_rate": 3.334615384615384e-05,
1540
+ "loss": 2.1714,
1541
+ "step": 19700
1542
+ },
1543
+ {
1544
+ "epoch": 29.25,
1545
+ "learning_rate": 3.3110675039246465e-05,
1546
+ "loss": 2.1514,
1547
+ "step": 19800
1548
+ },
1549
+ {
1550
+ "epoch": 29.39,
1551
+ "learning_rate": 3.2875196232339087e-05,
1552
+ "loss": 2.1374,
1553
+ "step": 19900
1554
+ },
1555
+ {
1556
+ "epoch": 29.54,
1557
+ "learning_rate": 3.263971742543171e-05,
1558
+ "loss": 2.1423,
1559
+ "step": 20000
1560
+ },
1561
+ {
1562
+ "epoch": 29.54,
1563
+ "eval_loss": 1.0245550870895386,
1564
+ "eval_runtime": 116.8375,
1565
+ "eval_samples_per_second": 17.298,
1566
+ "eval_steps_per_second": 2.165,
1567
+ "eval_wer": 1.9678377041068777,
1568
+ "step": 20000
1569
+ },
1570
+ {
1571
+ "epoch": 29.69,
1572
+ "learning_rate": 3.240423861852433e-05,
1573
+ "loss": 2.1807,
1574
+ "step": 20100
1575
+ },
1576
+ {
1577
+ "epoch": 29.84,
1578
+ "learning_rate": 3.216875981161695e-05,
1579
+ "loss": 2.1545,
1580
+ "step": 20200
1581
+ },
1582
+ {
1583
+ "epoch": 29.98,
1584
+ "learning_rate": 3.1933281004709574e-05,
1585
+ "loss": 2.1404,
1586
+ "step": 20300
1587
+ },
1588
+ {
1589
+ "epoch": 30.13,
1590
+ "learning_rate": 3.1697802197802195e-05,
1591
+ "loss": 2.1089,
1592
+ "step": 20400
1593
+ },
1594
+ {
1595
+ "epoch": 30.28,
1596
+ "learning_rate": 3.146232339089482e-05,
1597
+ "loss": 2.1649,
1598
+ "step": 20500
1599
+ },
1600
+ {
1601
+ "epoch": 30.28,
1602
+ "eval_loss": 0.9981661438941956,
1603
+ "eval_runtime": 116.056,
1604
+ "eval_samples_per_second": 17.414,
1605
+ "eval_steps_per_second": 2.18,
1606
+ "eval_wer": 2.000494804552202,
1607
+ "step": 20500
1608
+ },
1609
+ {
1610
+ "epoch": 30.43,
1611
+ "learning_rate": 3.122684458398744e-05,
1612
+ "loss": 2.1425,
1613
+ "step": 20600
1614
+ },
1615
+ {
1616
+ "epoch": 30.58,
1617
+ "learning_rate": 3.099136577708006e-05,
1618
+ "loss": 2.1357,
1619
+ "step": 20700
1620
+ },
1621
+ {
1622
+ "epoch": 30.72,
1623
+ "learning_rate": 3.0758241758241755e-05,
1624
+ "loss": 2.1251,
1625
+ "step": 20800
1626
+ },
1627
+ {
1628
+ "epoch": 30.87,
1629
+ "learning_rate": 3.052276295133438e-05,
1630
+ "loss": 2.1256,
1631
+ "step": 20900
1632
+ },
1633
+ {
1634
+ "epoch": 31.02,
1635
+ "learning_rate": 3.0287284144427e-05,
1636
+ "loss": 2.143,
1637
+ "step": 21000
1638
+ },
1639
+ {
1640
+ "epoch": 31.02,
1641
+ "eval_loss": 0.9985482692718506,
1642
+ "eval_runtime": 116.0424,
1643
+ "eval_samples_per_second": 17.416,
1644
+ "eval_steps_per_second": 2.18,
1645
+ "eval_wer": 2.045027214250371,
1646
+ "step": 21000
1647
+ },
1648
+ {
1649
+ "epoch": 31.17,
1650
+ "learning_rate": 3.005180533751962e-05,
1651
+ "loss": 2.0744,
1652
+ "step": 21100
1653
+ },
1654
+ {
1655
+ "epoch": 31.31,
1656
+ "learning_rate": 2.9816326530612242e-05,
1657
+ "loss": 2.0831,
1658
+ "step": 21200
1659
+ },
1660
+ {
1661
+ "epoch": 31.46,
1662
+ "learning_rate": 2.9583202511773936e-05,
1663
+ "loss": 2.1254,
1664
+ "step": 21300
1665
+ },
1666
+ {
1667
+ "epoch": 31.61,
1668
+ "learning_rate": 2.934772370486656e-05,
1669
+ "loss": 2.1357,
1670
+ "step": 21400
1671
+ },
1672
+ {
1673
+ "epoch": 31.76,
1674
+ "learning_rate": 2.911224489795918e-05,
1675
+ "loss": 2.1338,
1676
+ "step": 21500
1677
+ },
1678
+ {
1679
+ "epoch": 31.76,
1680
+ "eval_loss": 0.9932034611701965,
1681
+ "eval_runtime": 114.6961,
1682
+ "eval_samples_per_second": 17.62,
1683
+ "eval_steps_per_second": 2.206,
1684
+ "eval_wer": 2.0024740227610094,
1685
+ "step": 21500
1686
+ },
1687
+ {
1688
+ "epoch": 31.91,
1689
+ "learning_rate": 2.8876766091051805e-05,
1690
+ "loss": 2.1053,
1691
+ "step": 21600
1692
+ },
1693
+ {
1694
+ "epoch": 32.05,
1695
+ "learning_rate": 2.8641287284144426e-05,
1696
+ "loss": 2.1111,
1697
+ "step": 21700
1698
+ },
1699
+ {
1700
+ "epoch": 32.2,
1701
+ "learning_rate": 2.8405808477237045e-05,
1702
+ "loss": 2.1028,
1703
+ "step": 21800
1704
+ },
1705
+ {
1706
+ "epoch": 32.35,
1707
+ "learning_rate": 2.817032967032967e-05,
1708
+ "loss": 2.0879,
1709
+ "step": 21900
1710
+ },
1711
+ {
1712
+ "epoch": 32.5,
1713
+ "learning_rate": 2.793485086342229e-05,
1714
+ "loss": 2.1076,
1715
+ "step": 22000
1716
+ },
1717
+ {
1718
+ "epoch": 32.5,
1719
+ "eval_loss": 0.9902665019035339,
1720
+ "eval_runtime": 120.6987,
1721
+ "eval_samples_per_second": 16.744,
1722
+ "eval_steps_per_second": 2.096,
1723
+ "eval_wer": 2.0504700643245917,
1724
+ "step": 22000
1725
+ },
1726
+ {
1727
+ "epoch": 32.64,
1728
+ "learning_rate": 2.769937205651491e-05,
1729
+ "loss": 2.1107,
1730
+ "step": 22100
1731
+ },
1732
+ {
1733
+ "epoch": 32.79,
1734
+ "learning_rate": 2.7463893249607535e-05,
1735
+ "loss": 2.0953,
1736
+ "step": 22200
1737
+ },
1738
+ {
1739
+ "epoch": 32.94,
1740
+ "learning_rate": 2.7228414442700154e-05,
1741
+ "loss": 2.0619,
1742
+ "step": 22300
1743
+ },
1744
+ {
1745
+ "epoch": 33.09,
1746
+ "learning_rate": 2.6992935635792776e-05,
1747
+ "loss": 2.0531,
1748
+ "step": 22400
1749
+ },
1750
+ {
1751
+ "epoch": 33.23,
1752
+ "learning_rate": 2.6757456828885397e-05,
1753
+ "loss": 2.0519,
1754
+ "step": 22500
1755
+ },
1756
+ {
1757
+ "epoch": 33.23,
1758
+ "eval_loss": 0.9833839535713196,
1759
+ "eval_runtime": 116.5317,
1760
+ "eval_samples_per_second": 17.343,
1761
+ "eval_steps_per_second": 2.171,
1762
+ "eval_wer": 2.07372587827808,
1763
+ "step": 22500
1764
+ },
1765
+ {
1766
+ "epoch": 33.38,
1767
+ "learning_rate": 2.652197802197802e-05,
1768
+ "loss": 2.0493,
1769
+ "step": 22600
1770
+ },
1771
+ {
1772
+ "epoch": 33.53,
1773
+ "learning_rate": 2.6286499215070644e-05,
1774
+ "loss": 2.0749,
1775
+ "step": 22700
1776
+ },
1777
+ {
1778
+ "epoch": 33.68,
1779
+ "learning_rate": 2.6051020408163263e-05,
1780
+ "loss": 2.0838,
1781
+ "step": 22800
1782
+ },
1783
+ {
1784
+ "epoch": 33.83,
1785
+ "learning_rate": 2.5815541601255884e-05,
1786
+ "loss": 2.0629,
1787
+ "step": 22900
1788
+ },
1789
+ {
1790
+ "epoch": 33.97,
1791
+ "learning_rate": 2.5580062794348506e-05,
1792
+ "loss": 2.0534,
1793
+ "step": 23000
1794
+ },
1795
+ {
1796
+ "epoch": 33.97,
1797
+ "eval_loss": 0.9755652546882629,
1798
+ "eval_runtime": 114.923,
1799
+ "eval_samples_per_second": 17.586,
1800
+ "eval_steps_per_second": 2.201,
1801
+ "eval_wer": 2.024740227610094,
1802
+ "step": 23000
1803
+ },
1804
+ {
1805
+ "epoch": 34.12,
1806
+ "learning_rate": 2.5344583987441128e-05,
1807
+ "loss": 2.067,
1808
+ "step": 23100
1809
+ },
1810
+ {
1811
+ "epoch": 34.27,
1812
+ "learning_rate": 2.5109105180533746e-05,
1813
+ "loss": 2.0252,
1814
+ "step": 23200
1815
+ },
1816
+ {
1817
+ "epoch": 34.42,
1818
+ "learning_rate": 2.487362637362637e-05,
1819
+ "loss": 2.0483,
1820
+ "step": 23300
1821
+ },
1822
+ {
1823
+ "epoch": 34.56,
1824
+ "learning_rate": 2.4638147566718993e-05,
1825
+ "loss": 2.0464,
1826
+ "step": 23400
1827
+ },
1828
+ {
1829
+ "epoch": 34.71,
1830
+ "learning_rate": 2.4402668759811615e-05,
1831
+ "loss": 2.0121,
1832
+ "step": 23500
1833
+ },
1834
+ {
1835
+ "epoch": 34.71,
1836
+ "eval_loss": 0.968792736530304,
1837
+ "eval_runtime": 114.3088,
1838
+ "eval_samples_per_second": 17.68,
1839
+ "eval_steps_per_second": 2.213,
1840
+ "eval_wer": 2.1439881246907473,
1841
+ "step": 23500
1842
+ },
1843
+ {
1844
+ "epoch": 34.86,
1845
+ "learning_rate": 2.4167189952904237e-05,
1846
+ "loss": 2.036,
1847
+ "step": 23600
1848
+ },
1849
+ {
1850
+ "epoch": 35.01,
1851
+ "learning_rate": 2.3931711145996855e-05,
1852
+ "loss": 2.013,
1853
+ "step": 23700
1854
+ },
1855
+ {
1856
+ "epoch": 35.16,
1857
+ "learning_rate": 2.369623233908948e-05,
1858
+ "loss": 2.0043,
1859
+ "step": 23800
1860
+ },
1861
+ {
1862
+ "epoch": 35.3,
1863
+ "learning_rate": 2.3460753532182102e-05,
1864
+ "loss": 2.037,
1865
+ "step": 23900
1866
+ },
1867
+ {
1868
+ "epoch": 35.45,
1869
+ "learning_rate": 2.322527472527472e-05,
1870
+ "loss": 2.0161,
1871
+ "step": 24000
1872
+ },
1873
+ {
1874
+ "epoch": 35.45,
1875
+ "eval_loss": 0.9581586718559265,
1876
+ "eval_runtime": 115.925,
1877
+ "eval_samples_per_second": 17.434,
1878
+ "eval_steps_per_second": 2.182,
1879
+ "eval_wer": 2.1232063334982683,
1880
+ "step": 24000
1881
+ },
1882
+ {
1883
+ "epoch": 35.6,
1884
+ "learning_rate": 2.2989795918367346e-05,
1885
+ "loss": 2.0256,
1886
+ "step": 24100
1887
+ },
1888
+ {
1889
+ "epoch": 35.75,
1890
+ "learning_rate": 2.2754317111459968e-05,
1891
+ "loss": 2.0265,
1892
+ "step": 24200
1893
+ },
1894
+ {
1895
+ "epoch": 35.89,
1896
+ "learning_rate": 2.251883830455259e-05,
1897
+ "loss": 2.0298,
1898
+ "step": 24300
1899
+ },
1900
+ {
1901
+ "epoch": 36.04,
1902
+ "learning_rate": 2.228335949764521e-05,
1903
+ "loss": 2.0028,
1904
+ "step": 24400
1905
+ },
1906
+ {
1907
+ "epoch": 36.19,
1908
+ "learning_rate": 2.204788069073783e-05,
1909
+ "loss": 2.0178,
1910
+ "step": 24500
1911
+ },
1912
+ {
1913
+ "epoch": 36.19,
1914
+ "eval_loss": 0.9480372071266174,
1915
+ "eval_runtime": 116.8212,
1916
+ "eval_samples_per_second": 17.3,
1917
+ "eval_steps_per_second": 2.166,
1918
+ "eval_wer": 2.0895596239485403,
1919
+ "step": 24500
1920
+ },
1921
+ {
1922
+ "epoch": 36.34,
1923
+ "learning_rate": 2.1812401883830455e-05,
1924
+ "loss": 2.008,
1925
+ "step": 24600
1926
+ },
1927
+ {
1928
+ "epoch": 36.48,
1929
+ "learning_rate": 2.1576923076923076e-05,
1930
+ "loss": 2.0132,
1931
+ "step": 24700
1932
+ },
1933
+ {
1934
+ "epoch": 36.63,
1935
+ "learning_rate": 2.1341444270015695e-05,
1936
+ "loss": 2.0204,
1937
+ "step": 24800
1938
+ },
1939
+ {
1940
+ "epoch": 36.78,
1941
+ "learning_rate": 2.110596546310832e-05,
1942
+ "loss": 1.9806,
1943
+ "step": 24900
1944
+ },
1945
+ {
1946
+ "epoch": 36.93,
1947
+ "learning_rate": 2.087048665620094e-05,
1948
+ "loss": 2.0154,
1949
+ "step": 25000
1950
+ },
1951
+ {
1952
+ "epoch": 36.93,
1953
+ "eval_loss": 0.9483017325401306,
1954
+ "eval_runtime": 117.4294,
1955
+ "eval_samples_per_second": 17.21,
1956
+ "eval_steps_per_second": 2.154,
1957
+ "eval_wer": 2.078673923800099,
1958
+ "step": 25000
1959
+ },
1960
+ {
1961
+ "epoch": 37.08,
1962
+ "learning_rate": 2.063500784929356e-05,
1963
+ "loss": 1.997,
1964
+ "step": 25100
1965
+ },
1966
+ {
1967
+ "epoch": 37.22,
1968
+ "learning_rate": 2.0399529042386185e-05,
1969
+ "loss": 1.9712,
1970
+ "step": 25200
1971
+ },
1972
+ {
1973
+ "epoch": 37.37,
1974
+ "learning_rate": 2.0164050235478804e-05,
1975
+ "loss": 2.0131,
1976
+ "step": 25300
1977
+ },
1978
+ {
1979
+ "epoch": 37.52,
1980
+ "learning_rate": 1.992857142857143e-05,
1981
+ "loss": 1.9605,
1982
+ "step": 25400
1983
+ },
1984
+ {
1985
+ "epoch": 37.67,
1986
+ "learning_rate": 1.9695447409733123e-05,
1987
+ "loss": 1.9966,
1988
+ "step": 25500
1989
+ },
1990
+ {
1991
+ "epoch": 37.67,
1992
+ "eval_loss": 0.940608024597168,
1993
+ "eval_runtime": 115.2635,
1994
+ "eval_samples_per_second": 17.534,
1995
+ "eval_steps_per_second": 2.195,
1996
+ "eval_wer": 2.0296882731321126,
1997
+ "step": 25500
1998
+ },
1999
+ {
2000
+ "epoch": 37.81,
2001
+ "learning_rate": 1.945996860282574e-05,
2002
+ "loss": 1.9879,
2003
+ "step": 25600
2004
+ },
2005
+ {
2006
+ "epoch": 37.96,
2007
+ "learning_rate": 1.9224489795918367e-05,
2008
+ "loss": 1.9836,
2009
+ "step": 25700
2010
+ },
2011
+ {
2012
+ "epoch": 38.11,
2013
+ "learning_rate": 1.8989010989010988e-05,
2014
+ "loss": 1.9872,
2015
+ "step": 25800
2016
+ },
2017
+ {
2018
+ "epoch": 38.26,
2019
+ "learning_rate": 1.8753532182103607e-05,
2020
+ "loss": 1.9684,
2021
+ "step": 25900
2022
+ },
2023
+ {
2024
+ "epoch": 38.4,
2025
+ "learning_rate": 1.851805337519623e-05,
2026
+ "loss": 1.9753,
2027
+ "step": 26000
2028
+ },
2029
+ {
2030
+ "epoch": 38.4,
2031
+ "eval_loss": 0.9418594837188721,
2032
+ "eval_runtime": 115.7124,
2033
+ "eval_samples_per_second": 17.466,
2034
+ "eval_steps_per_second": 2.186,
2035
+ "eval_wer": 2.0346363186541314,
2036
+ "step": 26000
2037
+ },
2038
+ {
2039
+ "epoch": 38.55,
2040
+ "learning_rate": 1.828257456828885e-05,
2041
+ "loss": 1.9926,
2042
+ "step": 26100
2043
+ },
2044
+ {
2045
+ "epoch": 38.7,
2046
+ "learning_rate": 1.8047095761381475e-05,
2047
+ "loss": 1.9685,
2048
+ "step": 26200
2049
+ },
2050
+ {
2051
+ "epoch": 38.85,
2052
+ "learning_rate": 1.7811616954474097e-05,
2053
+ "loss": 1.9707,
2054
+ "step": 26300
2055
+ },
2056
+ {
2057
+ "epoch": 39.0,
2058
+ "learning_rate": 1.7576138147566716e-05,
2059
+ "loss": 1.9477,
2060
+ "step": 26400
2061
+ },
2062
+ {
2063
+ "epoch": 39.14,
2064
+ "learning_rate": 1.7340659340659337e-05,
2065
+ "loss": 1.9524,
2066
+ "step": 26500
2067
+ },
2068
+ {
2069
+ "epoch": 39.14,
2070
+ "eval_loss": 0.927354097366333,
2071
+ "eval_runtime": 115.8614,
2072
+ "eval_samples_per_second": 17.443,
2073
+ "eval_steps_per_second": 2.184,
2074
+ "eval_wer": 2.0697674418604652,
2075
+ "step": 26500
2076
+ },
2077
+ {
2078
+ "epoch": 39.29,
2079
+ "learning_rate": 1.7105180533751963e-05,
2080
+ "loss": 1.9673,
2081
+ "step": 26600
2082
+ },
2083
+ {
2084
+ "epoch": 39.44,
2085
+ "learning_rate": 1.6869701726844584e-05,
2086
+ "loss": 1.9802,
2087
+ "step": 26700
2088
+ },
2089
+ {
2090
+ "epoch": 39.59,
2091
+ "learning_rate": 1.6634222919937203e-05,
2092
+ "loss": 1.9408,
2093
+ "step": 26800
2094
+ },
2095
+ {
2096
+ "epoch": 39.73,
2097
+ "learning_rate": 1.6398744113029824e-05,
2098
+ "loss": 1.9482,
2099
+ "step": 26900
2100
+ },
2101
+ {
2102
+ "epoch": 39.88,
2103
+ "learning_rate": 1.6163265306122446e-05,
2104
+ "loss": 1.9427,
2105
+ "step": 27000
2106
+ },
2107
+ {
2108
+ "epoch": 39.88,
2109
+ "eval_loss": 0.9232719540596008,
2110
+ "eval_runtime": 116.3191,
2111
+ "eval_samples_per_second": 17.375,
2112
+ "eval_steps_per_second": 2.175,
2113
+ "eval_wer": 2.078673923800099,
2114
+ "step": 27000
2115
+ },
2116
+ {
2117
+ "epoch": 40.03,
2118
+ "learning_rate": 1.592778649921507e-05,
2119
+ "loss": 1.9653,
2120
+ "step": 27100
2121
+ },
2122
+ {
2123
+ "epoch": 40.18,
2124
+ "learning_rate": 1.569230769230769e-05,
2125
+ "loss": 1.9157,
2126
+ "step": 27200
2127
+ },
2128
+ {
2129
+ "epoch": 40.32,
2130
+ "learning_rate": 1.545682888540031e-05,
2131
+ "loss": 1.9493,
2132
+ "step": 27300
2133
+ },
2134
+ {
2135
+ "epoch": 40.47,
2136
+ "learning_rate": 1.5221350078492935e-05,
2137
+ "loss": 1.8974,
2138
+ "step": 27400
2139
+ },
2140
+ {
2141
+ "epoch": 40.62,
2142
+ "learning_rate": 1.4985871271585557e-05,
2143
+ "loss": 1.9258,
2144
+ "step": 27500
2145
+ },
2146
+ {
2147
+ "epoch": 40.62,
2148
+ "eval_loss": 0.9182448983192444,
2149
+ "eval_runtime": 115.4065,
2150
+ "eval_samples_per_second": 17.512,
2151
+ "eval_steps_per_second": 2.192,
2152
+ "eval_wer": 2.052944087085601,
2153
+ "step": 27500
2154
+ },
2155
+ {
2156
+ "epoch": 40.77,
2157
+ "learning_rate": 1.4750392464678177e-05,
2158
+ "loss": 1.9354,
2159
+ "step": 27600
2160
+ },
2161
+ {
2162
+ "epoch": 40.92,
2163
+ "learning_rate": 1.4514913657770799e-05,
2164
+ "loss": 1.952,
2165
+ "step": 27700
2166
+ },
2167
+ {
2168
+ "epoch": 41.06,
2169
+ "learning_rate": 1.4281789638932496e-05,
2170
+ "loss": 1.9231,
2171
+ "step": 27800
2172
+ },
2173
+ {
2174
+ "epoch": 41.21,
2175
+ "learning_rate": 1.4046310832025116e-05,
2176
+ "loss": 1.9465,
2177
+ "step": 27900
2178
+ },
2179
+ {
2180
+ "epoch": 41.36,
2181
+ "learning_rate": 1.3810832025117738e-05,
2182
+ "loss": 1.9031,
2183
+ "step": 28000
2184
+ },
2185
+ {
2186
+ "epoch": 41.36,
2187
+ "eval_loss": 0.9149593114852905,
2188
+ "eval_runtime": 116.2555,
2189
+ "eval_samples_per_second": 17.384,
2190
+ "eval_steps_per_second": 2.176,
2191
+ "eval_wer": 2.078673923800099,
2192
+ "step": 28000
2193
+ },
2194
+ {
2195
+ "epoch": 41.51,
2196
+ "learning_rate": 1.357535321821036e-05,
2197
+ "loss": 1.9361,
2198
+ "step": 28100
2199
+ },
2200
+ {
2201
+ "epoch": 41.65,
2202
+ "learning_rate": 1.3342229199372054e-05,
2203
+ "loss": 1.916,
2204
+ "step": 28200
2205
+ },
2206
+ {
2207
+ "epoch": 41.8,
2208
+ "learning_rate": 1.3106750392464677e-05,
2209
+ "loss": 1.9149,
2210
+ "step": 28300
2211
+ },
2212
+ {
2213
+ "epoch": 41.95,
2214
+ "learning_rate": 1.2871271585557299e-05,
2215
+ "loss": 1.9037,
2216
+ "step": 28400
2217
+ },
2218
+ {
2219
+ "epoch": 42.1,
2220
+ "learning_rate": 1.263579277864992e-05,
2221
+ "loss": 1.9297,
2222
+ "step": 28500
2223
+ },
2224
+ {
2225
+ "epoch": 42.1,
2226
+ "eval_loss": 0.9040070176124573,
2227
+ "eval_runtime": 113.8901,
2228
+ "eval_samples_per_second": 17.745,
2229
+ "eval_steps_per_second": 2.221,
2230
+ "eval_wer": 2.0504700643245917,
2231
+ "step": 28500
2232
+ },
2233
+ {
2234
+ "epoch": 42.25,
2235
+ "learning_rate": 1.2400313971742541e-05,
2236
+ "loss": 1.8855,
2237
+ "step": 28600
2238
+ },
2239
+ {
2240
+ "epoch": 42.39,
2241
+ "learning_rate": 1.2164835164835163e-05,
2242
+ "loss": 1.9095,
2243
+ "step": 28700
2244
+ },
2245
+ {
2246
+ "epoch": 42.54,
2247
+ "learning_rate": 1.1929356357927786e-05,
2248
+ "loss": 1.8913,
2249
+ "step": 28800
2250
+ },
2251
+ {
2252
+ "epoch": 42.69,
2253
+ "learning_rate": 1.1693877551020408e-05,
2254
+ "loss": 1.8685,
2255
+ "step": 28900
2256
+ },
2257
+ {
2258
+ "epoch": 42.84,
2259
+ "learning_rate": 1.1458398744113028e-05,
2260
+ "loss": 1.9041,
2261
+ "step": 29000
2262
+ },
2263
+ {
2264
+ "epoch": 42.84,
2265
+ "eval_loss": 0.9008907675743103,
2266
+ "eval_runtime": 114.9643,
2267
+ "eval_samples_per_second": 17.579,
2268
+ "eval_steps_per_second": 2.201,
2269
+ "eval_wer": 2.05789213260762,
2270
+ "step": 29000
2271
+ },
2272
+ {
2273
+ "epoch": 42.98,
2274
+ "learning_rate": 1.122291993720565e-05,
2275
+ "loss": 1.8963,
2276
+ "step": 29100
2277
+ },
2278
+ {
2279
+ "epoch": 43.13,
2280
+ "learning_rate": 1.0987441130298273e-05,
2281
+ "loss": 1.9068,
2282
+ "step": 29200
2283
+ },
2284
+ {
2285
+ "epoch": 43.28,
2286
+ "learning_rate": 1.0751962323390895e-05,
2287
+ "loss": 1.9003,
2288
+ "step": 29300
2289
+ },
2290
+ {
2291
+ "epoch": 43.43,
2292
+ "learning_rate": 1.0516483516483515e-05,
2293
+ "loss": 1.891,
2294
+ "step": 29400
2295
+ },
2296
+ {
2297
+ "epoch": 43.57,
2298
+ "learning_rate": 1.0281004709576137e-05,
2299
+ "loss": 1.8929,
2300
+ "step": 29500
2301
+ },
2302
+ {
2303
+ "epoch": 43.57,
2304
+ "eval_loss": 0.8968304991722107,
2305
+ "eval_runtime": 116.4378,
2306
+ "eval_samples_per_second": 17.357,
2307
+ "eval_steps_per_second": 2.173,
2308
+ "eval_wer": 2.032657100445324,
2309
+ "step": 29500
2310
+ },
2311
+ {
2312
+ "epoch": 43.72,
2313
+ "learning_rate": 1.0045525902668759e-05,
2314
+ "loss": 1.8827,
2315
+ "step": 29600
2316
+ },
2317
+ {
2318
+ "epoch": 43.87,
2319
+ "learning_rate": 9.810047095761382e-06,
2320
+ "loss": 1.8862,
2321
+ "step": 29700
2322
+ },
2323
+ {
2324
+ "epoch": 44.02,
2325
+ "learning_rate": 9.574568288854002e-06,
2326
+ "loss": 1.8787,
2327
+ "step": 29800
2328
+ },
2329
+ {
2330
+ "epoch": 44.17,
2331
+ "learning_rate": 9.339089481946624e-06,
2332
+ "loss": 1.8501,
2333
+ "step": 29900
2334
+ },
2335
+ {
2336
+ "epoch": 44.31,
2337
+ "learning_rate": 9.103610675039246e-06,
2338
+ "loss": 1.9077,
2339
+ "step": 30000
2340
+ },
2341
+ {
2342
+ "epoch": 44.31,
2343
+ "eval_loss": 0.8953686952590942,
2344
+ "eval_runtime": 115.4838,
2345
+ "eval_samples_per_second": 17.5,
2346
+ "eval_steps_per_second": 2.191,
2347
+ "eval_wer": 2.061850569025235,
2348
+ "step": 30000
2349
+ },
2350
+ {
2351
+ "epoch": 44.46,
2352
+ "learning_rate": 8.868131868131868e-06,
2353
+ "loss": 1.8804,
2354
+ "step": 30100
2355
+ },
2356
+ {
2357
+ "epoch": 44.61,
2358
+ "learning_rate": 8.63265306122449e-06,
2359
+ "loss": 1.8723,
2360
+ "step": 30200
2361
+ },
2362
+ {
2363
+ "epoch": 44.76,
2364
+ "learning_rate": 8.397174254317111e-06,
2365
+ "loss": 1.8577,
2366
+ "step": 30300
2367
+ },
2368
+ {
2369
+ "epoch": 44.9,
2370
+ "learning_rate": 8.161695447409733e-06,
2371
+ "loss": 1.8811,
2372
+ "step": 30400
2373
+ },
2374
+ {
2375
+ "epoch": 45.05,
2376
+ "learning_rate": 7.928571428571429e-06,
2377
+ "loss": 1.8504,
2378
+ "step": 30500
2379
+ },
2380
+ {
2381
+ "epoch": 45.05,
2382
+ "eval_loss": 0.892192542552948,
2383
+ "eval_runtime": 116.2513,
2384
+ "eval_samples_per_second": 17.385,
2385
+ "eval_steps_per_second": 2.176,
2386
+ "eval_wer": 2.07372587827808,
2387
+ "step": 30500
2388
+ },
2389
+ {
2390
+ "epoch": 45.2,
2391
+ "learning_rate": 7.693092621664049e-06,
2392
+ "loss": 1.861,
2393
+ "step": 30600
2394
+ },
2395
+ {
2396
+ "epoch": 45.35,
2397
+ "learning_rate": 7.457613814756671e-06,
2398
+ "loss": 1.8496,
2399
+ "step": 30700
2400
+ },
2401
+ {
2402
+ "epoch": 45.49,
2403
+ "learning_rate": 7.222135007849293e-06,
2404
+ "loss": 1.8612,
2405
+ "step": 30800
2406
+ },
2407
+ {
2408
+ "epoch": 45.64,
2409
+ "learning_rate": 6.986656200941915e-06,
2410
+ "loss": 1.865,
2411
+ "step": 30900
2412
+ },
2413
+ {
2414
+ "epoch": 45.79,
2415
+ "learning_rate": 6.751177394034536e-06,
2416
+ "loss": 1.8732,
2417
+ "step": 31000
2418
+ },
2419
+ {
2420
+ "epoch": 45.79,
2421
+ "eval_loss": 0.8897548317909241,
2422
+ "eval_runtime": 116.5927,
2423
+ "eval_samples_per_second": 17.334,
2424
+ "eval_steps_per_second": 2.17,
2425
+ "eval_wer": 2.0682830282038593,
2426
+ "step": 31000
2427
+ },
2428
+ {
2429
+ "epoch": 45.94,
2430
+ "learning_rate": 6.5156985871271585e-06,
2431
+ "loss": 1.8374,
2432
+ "step": 31100
2433
+ },
2434
+ {
2435
+ "epoch": 46.09,
2436
+ "learning_rate": 6.280219780219779e-06,
2437
+ "loss": 1.8395,
2438
+ "step": 31200
2439
+ },
2440
+ {
2441
+ "epoch": 46.23,
2442
+ "learning_rate": 6.044740973312402e-06,
2443
+ "loss": 1.8377,
2444
+ "step": 31300
2445
+ },
2446
+ {
2447
+ "epoch": 46.38,
2448
+ "learning_rate": 5.809262166405023e-06,
2449
+ "loss": 1.87,
2450
+ "step": 31400
2451
+ },
2452
+ {
2453
+ "epoch": 46.53,
2454
+ "learning_rate": 5.573783359497644e-06,
2455
+ "loss": 1.877,
2456
+ "step": 31500
2457
+ },
2458
+ {
2459
+ "epoch": 46.53,
2460
+ "eval_loss": 0.8848925828933716,
2461
+ "eval_runtime": 116.1465,
2462
+ "eval_samples_per_second": 17.4,
2463
+ "eval_steps_per_second": 2.178,
2464
+ "eval_wer": 2.0588817417120238,
2465
+ "step": 31500
2466
+ },
2467
+ {
2468
+ "epoch": 46.68,
2469
+ "learning_rate": 5.3383045525902665e-06,
2470
+ "loss": 1.8256,
2471
+ "step": 31600
2472
+ },
2473
+ {
2474
+ "epoch": 46.82,
2475
+ "learning_rate": 5.1028257456828875e-06,
2476
+ "loss": 1.8317,
2477
+ "step": 31700
2478
+ },
2479
+ {
2480
+ "epoch": 46.97,
2481
+ "learning_rate": 4.86734693877551e-06,
2482
+ "loss": 1.8579,
2483
+ "step": 31800
2484
+ },
2485
+ {
2486
+ "epoch": 47.12,
2487
+ "learning_rate": 4.631868131868132e-06,
2488
+ "loss": 1.839,
2489
+ "step": 31900
2490
+ },
2491
+ {
2492
+ "epoch": 47.27,
2493
+ "learning_rate": 4.396389324960754e-06,
2494
+ "loss": 1.8587,
2495
+ "step": 32000
2496
+ },
2497
+ {
2498
+ "epoch": 47.27,
2499
+ "eval_loss": 0.8843359351158142,
2500
+ "eval_runtime": 116.5866,
2501
+ "eval_samples_per_second": 17.335,
2502
+ "eval_steps_per_second": 2.17,
2503
+ "eval_wer": 2.045027214250371,
2504
+ "step": 32000
2505
+ },
2506
+ {
2507
+ "epoch": 47.41,
2508
+ "learning_rate": 4.160910518053375e-06,
2509
+ "loss": 1.8419,
2510
+ "step": 32100
2511
+ },
2512
+ {
2513
+ "epoch": 47.56,
2514
+ "learning_rate": 3.925431711145996e-06,
2515
+ "loss": 1.8639,
2516
+ "step": 32200
2517
+ },
2518
+ {
2519
+ "epoch": 47.71,
2520
+ "learning_rate": 3.6899529042386186e-06,
2521
+ "loss": 1.8395,
2522
+ "step": 32300
2523
+ },
2524
+ {
2525
+ "epoch": 47.86,
2526
+ "learning_rate": 3.45447409733124e-06,
2527
+ "loss": 1.8369,
2528
+ "step": 32400
2529
+ },
2530
+ {
2531
+ "epoch": 48.01,
2532
+ "learning_rate": 3.2189952904238617e-06,
2533
+ "loss": 1.8236,
2534
+ "step": 32500
2535
+ },
2536
+ {
2537
+ "epoch": 48.01,
2538
+ "eval_loss": 0.8810222148895264,
2539
+ "eval_runtime": 115.817,
2540
+ "eval_samples_per_second": 17.45,
2541
+ "eval_steps_per_second": 2.184,
2542
+ "eval_wer": 2.0554181098466104,
2543
+ "step": 32500
2544
+ },
2545
+ {
2546
+ "epoch": 48.15,
2547
+ "learning_rate": 2.9835164835164835e-06,
2548
+ "loss": 1.8468,
2549
+ "step": 32600
2550
+ },
2551
+ {
2552
+ "epoch": 48.3,
2553
+ "learning_rate": 2.7503924646781788e-06,
2554
+ "loss": 1.8326,
2555
+ "step": 32700
2556
+ },
2557
+ {
2558
+ "epoch": 48.45,
2559
+ "learning_rate": 2.5149136577708006e-06,
2560
+ "loss": 1.8279,
2561
+ "step": 32800
2562
+ },
2563
+ {
2564
+ "epoch": 48.6,
2565
+ "learning_rate": 2.2794348508634223e-06,
2566
+ "loss": 1.8324,
2567
+ "step": 32900
2568
+ },
2569
+ {
2570
+ "epoch": 48.74,
2571
+ "learning_rate": 2.043956043956044e-06,
2572
+ "loss": 1.8392,
2573
+ "step": 33000
2574
+ },
2575
+ {
2576
+ "epoch": 48.74,
2577
+ "eval_loss": 0.8820456266403198,
2578
+ "eval_runtime": 115.6891,
2579
+ "eval_samples_per_second": 17.469,
2580
+ "eval_steps_per_second": 2.187,
2581
+ "eval_wer": 2.0573973280554183,
2582
+ "step": 33000
2583
+ },
2584
+ {
2585
+ "epoch": 48.89,
2586
+ "learning_rate": 1.8084772370486653e-06,
2587
+ "loss": 1.8363,
2588
+ "step": 33100
2589
+ },
2590
+ {
2591
+ "epoch": 49.04,
2592
+ "learning_rate": 1.572998430141287e-06,
2593
+ "loss": 1.7996,
2594
+ "step": 33200
2595
+ },
2596
+ {
2597
+ "epoch": 49.19,
2598
+ "learning_rate": 1.3375196232339088e-06,
2599
+ "loss": 1.8113,
2600
+ "step": 33300
2601
+ },
2602
+ {
2603
+ "epoch": 49.34,
2604
+ "learning_rate": 1.1020408163265304e-06,
2605
+ "loss": 1.8428,
2606
+ "step": 33400
2607
+ },
2608
+ {
2609
+ "epoch": 49.48,
2610
+ "learning_rate": 8.665620094191522e-07,
2611
+ "loss": 1.8428,
2612
+ "step": 33500
2613
+ },
2614
+ {
2615
+ "epoch": 49.48,
2616
+ "eval_loss": 0.8815611600875854,
2617
+ "eval_runtime": 117.0058,
2618
+ "eval_samples_per_second": 17.273,
2619
+ "eval_steps_per_second": 2.162,
2620
+ "eval_wer": 2.066798614547254,
2621
+ "step": 33500
2622
+ }
2623
+ ],
2624
+ "max_steps": 33850,
2625
+ "num_train_epochs": 50,
2626
+ "total_flos": 1.524607779428202e+20,
2627
+ "trial_name": null,
2628
+ "trial_params": null
2629
+ }
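The `log_history` above follows the usual Hugging Face `Trainer` layout: training records carry `loss`, `learning_rate`, and `step`, while evaluation records carry `eval_loss`, `eval_wer`, and the runtime fields. A minimal sketch for separating the two curves from the checkpoint's state file (path as added in this commit; anything beyond printing is left to the reader):

```python
import json

# Path as added in this commit.
with open("checkpoint-33500/trainer_state.json") as f:
    state = json.load(f)

# Training entries have "loss"; evaluation entries have "eval_loss" instead.
train_curve = [(e["step"], e["loss"]) for e in state["log_history"] if "loss" in e]
eval_curve = [(e["step"], e["eval_loss"], e["eval_wer"]) for e in state["log_history"] if "eval_loss" in e]

print(train_curve[-1])  # last logged training loss, e.g. (33500, 1.8428)
print(eval_curve[-1])   # last logged eval loss / WER, roughly (33500, 0.8816, 2.0668)
```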
checkpoint-33500/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:865e00e6503ba9a84aae151f6fca7048aa8118080935253305e52fcf3ebbf980
+ size 3055
config.json ADDED
@@ -0,0 +1,107 @@
+ {
+ "_name_or_path": "facebook/wav2vec2-xls-r-300m",
+ "activation_dropout": 0.1,
+ "adapter_kernel_size": 3,
+ "adapter_stride": 2,
+ "add_adapter": false,
+ "apply_spec_augment": true,
+ "architectures": [
+ "Wav2Vec2ForCTC"
+ ],
+ "attention_dropout": 0.0,
+ "bos_token_id": 1,
+ "classifier_proj_size": 256,
+ "codevector_dim": 768,
+ "contrastive_logits_temperature": 0.1,
+ "conv_bias": true,
+ "conv_dim": [
+ 512,
+ 512,
+ 512,
+ 512,
+ 512,
+ 512,
+ 512
+ ],
+ "conv_kernel": [
+ 10,
+ 3,
+ 3,
+ 3,
+ 3,
+ 2,
+ 2
+ ],
+ "conv_stride": [
+ 5,
+ 2,
+ 2,
+ 2,
+ 2,
+ 2,
+ 2
+ ],
+ "ctc_loss_reduction": "mean",
+ "ctc_zero_infinity": false,
+ "diversity_loss_weight": 0.1,
+ "do_stable_layer_norm": true,
+ "eos_token_id": 2,
+ "feat_extract_activation": "gelu",
+ "feat_extract_dropout": 0.0,
+ "feat_extract_norm": "layer",
+ "feat_proj_dropout": 0.0,
+ "feat_quantizer_dropout": 0.0,
+ "final_dropout": 0.0,
+ "hidden_act": "gelu",
+ "hidden_dropout": 0.0,
+ "hidden_size": 1024,
+ "initializer_range": 0.02,
+ "intermediate_size": 4096,
+ "layer_norm_eps": 1e-05,
+ "layerdrop": 0.0,
+ "mask_feature_length": 64,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.25,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.75,
+ "model_type": "wav2vec2",
+ "num_adapter_layers": 3,
+ "num_attention_heads": 16,
+ "num_codevector_groups": 2,
+ "num_codevectors_per_group": 320,
+ "num_conv_pos_embedding_groups": 16,
+ "num_conv_pos_embeddings": 128,
+ "num_feat_extract_layers": 7,
+ "num_hidden_layers": 24,
+ "num_negatives": 100,
+ "output_hidden_size": 1024,
+ "pad_token_id": 4649,
+ "proj_codevector_dim": 768,
+ "tdnn_dilation": [
+ 1,
+ 2,
+ 3,
+ 1,
+ 1
+ ],
+ "tdnn_dim": [
+ 512,
+ 512,
+ 512,
+ 512,
+ 1500
+ ],
+ "tdnn_kernel": [
+ 5,
+ 3,
+ 3,
+ 1,
+ 1
+ ],
+ "torch_dtype": "float32",
+ "transformers_version": "4.17.0.dev0",
+ "use_weighted_layer_sum": false,
+ "vocab_size": 4652,
+ "xvector_output_dim": 512
+ }
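config.json fixes the architecture to `Wav2Vec2ForCTC`: a 24-layer, 1024-dimensional encoder on top of the 7-layer convolutional feature extractor, with a 4652-entry CTC output vocabulary. A quick sanity check that loads only the configuration, no weights; the repo id below is an assumption based on the model name:

```python
from transformers import Wav2Vec2Config

repo_id = "samitizerxu/wav2vec2-xls-r-300m-zh-CN"  # assumed repo id for this model
config = Wav2Vec2Config.from_pretrained(repo_id)

print(config.architectures)       # ['Wav2Vec2ForCTC']
print(config.num_hidden_layers)   # 24 transformer layers
print(config.hidden_size)         # 1024
print(config.vocab_size)          # 4652 output tokens for the CTC head
print(config.pad_token_id)        # 4649; Wav2Vec2ForCTC also uses this id as the CTC blank
```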
eval_results.json ADDED
@@ -0,0 +1,9 @@
+ {
+ "epoch": 50.0,
+ "eval_loss": 0.8828312158584595,
+ "eval_runtime": 118.6394,
+ "eval_samples": 2021,
+ "eval_samples_per_second": 17.035,
+ "eval_steps_per_second": 2.133,
+ "eval_wer": 2.060366155368629
+ }
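An `eval_wer` above 1.0 (here ≈ 2.06, i.e. 206%) normally means the word segmentation of predictions and references does not line up; with unsegmented Chinese text each utterance is scored as a single "word", so extra tokens in the prediction count as insertions and push WER past 1.0, and character error rate is usually the more informative number. A hedged sketch using the `evaluate` package (assumed installed) with made-up example strings:

```python
import evaluate

wer = evaluate.load("wer")
cer = evaluate.load("cer")

# Toy example: one wrong character, but the prediction splits into two tokens.
# Against a single reference "word" that is 1 substitution + 1 insertion,
# so word-level WER comes out as 2.0 even though only 1 of 6 characters is wrong.
references = ["今天天气很好"]
predictions = ["今天 天汽很好"]

print(wer.compute(predictions=predictions, references=references))  # 2.0
print(cer.compute(predictions=predictions, references=references))  # far lower: per-character edits
```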
preprocessor_config.json ADDED
@@ -0,0 +1,9 @@
+ {
+ "do_normalize": true,
+ "feature_extractor_type": "Wav2Vec2FeatureExtractor",
+ "feature_size": 1,
+ "padding_side": "right",
+ "padding_value": 0,
+ "return_attention_mask": true,
+ "sampling_rate": 16000
+ }
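preprocessor_config.json confirms the expected input: mono float audio at 16 kHz, normalized, with an attention mask returned for padded batches. A minimal transcription sketch under those assumptions (the repo id and the silent placeholder audio are illustrative only):

```python
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

repo_id = "samitizerxu/wav2vec2-xls-r-300m-zh-CN"   # assumed repo id
processor = Wav2Vec2Processor.from_pretrained(repo_id)
model = Wav2Vec2ForCTC.from_pretrained(repo_id)
model.eval()

# Placeholder: one second of silence; replace with real audio resampled to 16 kHz.
audio = np.zeros(16000, dtype=np.float32)

inputs = processor(audio, sampling_rate=16000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits

pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```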
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:52cae5ca6764c14d42c018ee3aaf4fb91e744d29168ff9ab649360aeff2859e2
+ size 1280996913
special_tokens_map.json ADDED
@@ -0,0 +1 @@
+ {"bos_token": "<s>", "eos_token": "</s>", "unk_token": "[UNK]", "pad_token": "[PAD]", "additional_special_tokens": [{"content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, {"content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}]}
tokenizer_config.json ADDED
@@ -0,0 +1 @@
+ {"unk_token": "[UNK]", "bos_token": "<s>", "eos_token": "</s>", "pad_token": "[PAD]", "do_lower_case": false, "word_delimiter_token": "|", "special_tokens_map_file": null, "tokenizer_file": null, "name_or_path": "./wav2vec2-xls-r-300m-zh-CN", "tokenizer_class": "Wav2Vec2CTCTokenizer"}
train_results.json ADDED
@@ -0,0 +1,8 @@
+ {
+ "epoch": 50.0,
+ "train_loss": 4.34445102640938,
+ "train_runtime": 69888.62,
+ "train_samples": 21672,
+ "train_samples_per_second": 15.505,
+ "train_steps_per_second": 0.484
+ }
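The run summary is consistent with the hyperparameters in the model card: 21,672 training samples, a per-device batch of 8, and 4 gradient-accumulation steps give 677 optimizer updates per epoch, so 50 epochs yield the `max_steps` of 33,850 recorded in trainer_state.json. A back-of-envelope check (the flooring mimics how partial accumulation steps are dropped; a sketch, not the exact `Trainer` bookkeeping):

```python
import math

train_samples = 21672        # from train_results.json
per_device_batch_size = 8    # from the model card hyperparameters
gradient_accumulation = 4
num_epochs = 50

batches_per_epoch = math.ceil(train_samples / per_device_batch_size)  # 2709
updates_per_epoch = batches_per_epoch // gradient_accumulation        # 677
max_steps = updates_per_epoch * num_epochs                            # 33850

print(batches_per_epoch, updates_per_epoch, max_steps)
```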
trainer_state.json ADDED
@@ -0,0 +1,2656 @@
1
+ {
2
+ "best_metric": null,
3
+ "best_model_checkpoint": null,
4
+ "epoch": 49.99963086009598,
5
+ "global_step": 33850,
6
+ "is_hyper_param_search": false,
7
+ "is_local_process_zero": true,
8
+ "is_world_process_zero": true,
9
+ "log_history": [
10
+ {
11
+ "epoch": 0.15,
12
+ "learning_rate": 3.6375e-06,
13
+ "loss": 124.9665,
14
+ "step": 100
15
+ },
16
+ {
17
+ "epoch": 0.3,
18
+ "learning_rate": 7.3875e-06,
19
+ "loss": 92.673,
20
+ "step": 200
21
+ },
22
+ {
23
+ "epoch": 0.44,
24
+ "learning_rate": 1.1099999999999999e-05,
25
+ "loss": 74.8932,
26
+ "step": 300
27
+ },
28
+ {
29
+ "epoch": 0.59,
30
+ "learning_rate": 1.485e-05,
31
+ "loss": 68.0432,
32
+ "step": 400
33
+ },
34
+ {
35
+ "epoch": 0.74,
36
+ "learning_rate": 1.8599999999999998e-05,
37
+ "loss": 60.2112,
38
+ "step": 500
39
+ },
40
+ {
41
+ "epoch": 0.74,
42
+ "eval_loss": 64.81886291503906,
43
+ "eval_runtime": 129.9516,
44
+ "eval_samples_per_second": 15.552,
45
+ "eval_steps_per_second": 1.947,
46
+ "eval_wer": 1.0,
47
+ "step": 500
48
+ },
49
+ {
50
+ "epoch": 0.89,
51
+ "learning_rate": 2.2349999999999998e-05,
52
+ "loss": 51.3096,
53
+ "step": 600
54
+ },
55
+ {
56
+ "epoch": 1.03,
57
+ "learning_rate": 2.6099999999999997e-05,
58
+ "loss": 39.1106,
59
+ "step": 700
60
+ },
61
+ {
62
+ "epoch": 1.18,
63
+ "learning_rate": 2.985e-05,
64
+ "loss": 26.6843,
65
+ "step": 800
66
+ },
67
+ {
68
+ "epoch": 1.33,
69
+ "learning_rate": 3.36e-05,
70
+ "loss": 14.7864,
71
+ "step": 900
72
+ },
73
+ {
74
+ "epoch": 1.48,
75
+ "learning_rate": 3.735e-05,
76
+ "loss": 8.1128,
77
+ "step": 1000
78
+ },
79
+ {
80
+ "epoch": 1.48,
81
+ "eval_loss": 6.899676322937012,
82
+ "eval_runtime": 115.5788,
83
+ "eval_samples_per_second": 17.486,
84
+ "eval_steps_per_second": 2.189,
85
+ "eval_wer": 1.0,
86
+ "step": 1000
87
+ },
88
+ {
89
+ "epoch": 1.62,
90
+ "learning_rate": 4.11e-05,
91
+ "loss": 6.6068,
92
+ "step": 1100
93
+ },
94
+ {
95
+ "epoch": 1.77,
96
+ "learning_rate": 4.484999999999999e-05,
97
+ "loss": 6.23,
98
+ "step": 1200
99
+ },
100
+ {
101
+ "epoch": 1.92,
102
+ "learning_rate": 4.8599999999999995e-05,
103
+ "loss": 6.0972,
104
+ "step": 1300
105
+ },
106
+ {
107
+ "epoch": 2.07,
108
+ "learning_rate": 5.234999999999999e-05,
109
+ "loss": 6.0595,
110
+ "step": 1400
111
+ },
112
+ {
113
+ "epoch": 2.22,
114
+ "learning_rate": 5.6099999999999995e-05,
115
+ "loss": 6.0492,
116
+ "step": 1500
117
+ },
118
+ {
119
+ "epoch": 2.22,
120
+ "eval_loss": 5.967654228210449,
121
+ "eval_runtime": 115.432,
122
+ "eval_samples_per_second": 17.508,
123
+ "eval_steps_per_second": 2.192,
124
+ "eval_wer": 1.949529935675408,
125
+ "step": 1500
126
+ },
127
+ {
128
+ "epoch": 2.36,
129
+ "learning_rate": 5.985e-05,
130
+ "loss": 6.0266,
131
+ "step": 1600
132
+ },
133
+ {
134
+ "epoch": 2.51,
135
+ "learning_rate": 6.359999999999999e-05,
136
+ "loss": 5.9902,
137
+ "step": 1700
138
+ },
139
+ {
140
+ "epoch": 2.66,
141
+ "learning_rate": 6.735e-05,
142
+ "loss": 5.9762,
143
+ "step": 1800
144
+ },
145
+ {
146
+ "epoch": 2.81,
147
+ "learning_rate": 7.11e-05,
148
+ "loss": 5.9491,
149
+ "step": 1900
150
+ },
151
+ {
152
+ "epoch": 2.95,
153
+ "learning_rate": 7.484999999999999e-05,
154
+ "loss": 5.9326,
155
+ "step": 2000
156
+ },
157
+ {
158
+ "epoch": 2.95,
159
+ "eval_loss": 5.884542942047119,
160
+ "eval_runtime": 114.597,
161
+ "eval_samples_per_second": 17.636,
162
+ "eval_steps_per_second": 2.208,
163
+ "eval_wer": 1.409203364670955,
164
+ "step": 2000
165
+ },
166
+ {
167
+ "epoch": 3.1,
168
+ "learning_rate": 7.477394034536891e-05,
169
+ "loss": 5.9356,
170
+ "step": 2100
171
+ },
172
+ {
173
+ "epoch": 3.25,
174
+ "learning_rate": 7.453846153846153e-05,
175
+ "loss": 5.8889,
176
+ "step": 2200
177
+ },
178
+ {
179
+ "epoch": 3.4,
180
+ "learning_rate": 7.430298273155415e-05,
181
+ "loss": 5.899,
182
+ "step": 2300
183
+ },
184
+ {
185
+ "epoch": 3.54,
186
+ "learning_rate": 7.406750392464678e-05,
187
+ "loss": 5.8824,
188
+ "step": 2400
189
+ },
190
+ {
191
+ "epoch": 3.69,
192
+ "learning_rate": 7.38320251177394e-05,
193
+ "loss": 5.8763,
194
+ "step": 2500
195
+ },
196
+ {
197
+ "epoch": 3.69,
198
+ "eval_loss": 5.846009731292725,
199
+ "eval_runtime": 117.5393,
200
+ "eval_samples_per_second": 17.194,
201
+ "eval_steps_per_second": 2.152,
202
+ "eval_wer": 1.6125680356259278,
203
+ "step": 2500
204
+ },
205
+ {
206
+ "epoch": 3.84,
207
+ "learning_rate": 7.359654631083201e-05,
208
+ "loss": 5.875,
209
+ "step": 2600
210
+ },
211
+ {
212
+ "epoch": 3.99,
213
+ "learning_rate": 7.336106750392464e-05,
214
+ "loss": 5.8671,
215
+ "step": 2700
216
+ },
217
+ {
218
+ "epoch": 4.14,
219
+ "learning_rate": 7.312558869701726e-05,
220
+ "loss": 5.8591,
221
+ "step": 2800
222
+ },
223
+ {
224
+ "epoch": 4.28,
225
+ "learning_rate": 7.289010989010989e-05,
226
+ "loss": 5.8226,
227
+ "step": 2900
228
+ },
229
+ {
230
+ "epoch": 4.43,
231
+ "learning_rate": 7.265463108320251e-05,
232
+ "loss": 5.7888,
233
+ "step": 3000
234
+ },
235
+ {
236
+ "epoch": 4.43,
237
+ "eval_loss": 5.75445032119751,
238
+ "eval_runtime": 114.1832,
239
+ "eval_samples_per_second": 17.7,
240
+ "eval_steps_per_second": 2.216,
241
+ "eval_wer": 2.2033646709549726,
242
+ "step": 3000
243
+ },
244
+ {
245
+ "epoch": 4.58,
246
+ "learning_rate": 7.241915227629513e-05,
247
+ "loss": 5.8041,
248
+ "step": 3100
249
+ },
250
+ {
251
+ "epoch": 4.73,
252
+ "learning_rate": 7.218367346938774e-05,
253
+ "loss": 5.8013,
254
+ "step": 3200
255
+ },
256
+ {
257
+ "epoch": 4.87,
258
+ "learning_rate": 7.194819466248037e-05,
259
+ "loss": 5.7947,
260
+ "step": 3300
261
+ },
262
+ {
263
+ "epoch": 5.02,
264
+ "learning_rate": 7.171271585557299e-05,
265
+ "loss": 5.7802,
266
+ "step": 3400
267
+ },
268
+ {
269
+ "epoch": 5.17,
270
+ "learning_rate": 7.147723704866562e-05,
271
+ "loss": 5.735,
272
+ "step": 3500
273
+ },
274
+ {
275
+ "epoch": 5.17,
276
+ "eval_loss": 5.677657604217529,
277
+ "eval_runtime": 115.6516,
278
+ "eval_samples_per_second": 17.475,
279
+ "eval_steps_per_second": 2.188,
280
+ "eval_wer": 2.334982681840673,
281
+ "step": 3500
282
+ },
283
+ {
284
+ "epoch": 5.32,
285
+ "learning_rate": 7.124175824175823e-05,
286
+ "loss": 5.7198,
287
+ "step": 3600
288
+ },
289
+ {
290
+ "epoch": 5.47,
291
+ "learning_rate": 7.100627943485086e-05,
292
+ "loss": 5.7092,
293
+ "step": 3700
294
+ },
295
+ {
296
+ "epoch": 5.61,
297
+ "learning_rate": 7.077080062794347e-05,
298
+ "loss": 5.6613,
299
+ "step": 3800
300
+ },
301
+ {
302
+ "epoch": 5.76,
303
+ "learning_rate": 7.05353218210361e-05,
304
+ "loss": 5.6579,
305
+ "step": 3900
306
+ },
307
+ {
308
+ "epoch": 5.91,
309
+ "learning_rate": 7.029984301412873e-05,
310
+ "loss": 5.6861,
311
+ "step": 4000
312
+ },
313
+ {
314
+ "epoch": 5.91,
315
+ "eval_loss": 5.517865180969238,
316
+ "eval_runtime": 115.3653,
317
+ "eval_samples_per_second": 17.518,
318
+ "eval_steps_per_second": 2.193,
319
+ "eval_wer": 2.223156853043048,
320
+ "step": 4000
321
+ },
322
+ {
323
+ "epoch": 6.06,
324
+ "learning_rate": 7.006436420722135e-05,
325
+ "loss": 5.6024,
326
+ "step": 4100
327
+ },
328
+ {
329
+ "epoch": 6.2,
330
+ "learning_rate": 6.982888540031396e-05,
331
+ "loss": 5.5497,
332
+ "step": 4200
333
+ },
334
+ {
335
+ "epoch": 6.35,
336
+ "learning_rate": 6.959340659340659e-05,
337
+ "loss": 5.5257,
338
+ "step": 4300
339
+ },
340
+ {
341
+ "epoch": 6.5,
342
+ "learning_rate": 6.93579277864992e-05,
343
+ "loss": 5.4534,
344
+ "step": 4400
345
+ },
346
+ {
347
+ "epoch": 6.65,
348
+ "learning_rate": 6.912244897959182e-05,
349
+ "loss": 5.381,
350
+ "step": 4500
351
+ },
352
+ {
353
+ "epoch": 6.65,
354
+ "eval_loss": 5.142032146453857,
355
+ "eval_runtime": 117.6237,
356
+ "eval_samples_per_second": 17.182,
357
+ "eval_steps_per_second": 2.151,
358
+ "eval_wer": 2.18159327065809,
359
+ "step": 4500
360
+ },
361
+ {
362
+ "epoch": 6.79,
363
+ "learning_rate": 6.888697017268445e-05,
364
+ "loss": 5.3409,
365
+ "step": 4600
366
+ },
367
+ {
368
+ "epoch": 6.94,
369
+ "learning_rate": 6.865149136577708e-05,
370
+ "loss": 5.1283,
371
+ "step": 4700
372
+ },
373
+ {
374
+ "epoch": 7.09,
375
+ "learning_rate": 6.841601255886969e-05,
376
+ "loss": 4.8788,
377
+ "step": 4800
378
+ },
379
+ {
380
+ "epoch": 7.24,
381
+ "learning_rate": 6.818053375196232e-05,
382
+ "loss": 4.7235,
383
+ "step": 4900
384
+ },
385
+ {
386
+ "epoch": 7.39,
387
+ "learning_rate": 6.794505494505494e-05,
388
+ "loss": 4.625,
389
+ "step": 5000
390
+ },
391
+ {
392
+ "epoch": 7.39,
393
+ "eval_loss": 3.9019837379455566,
394
+ "eval_runtime": 116.0971,
395
+ "eval_samples_per_second": 17.408,
396
+ "eval_steps_per_second": 2.179,
397
+ "eval_wer": 2.0722414646214746,
398
+ "step": 5000
399
+ },
400
+ {
401
+ "epoch": 7.53,
402
+ "learning_rate": 6.770957613814756e-05,
403
+ "loss": 4.5404,
404
+ "step": 5100
405
+ },
406
+ {
407
+ "epoch": 7.68,
408
+ "learning_rate": 6.747409733124018e-05,
409
+ "loss": 4.4307,
410
+ "step": 5200
411
+ },
412
+ {
413
+ "epoch": 7.83,
414
+ "learning_rate": 6.723861852433281e-05,
415
+ "loss": 4.3794,
416
+ "step": 5300
417
+ },
418
+ {
419
+ "epoch": 7.98,
420
+ "learning_rate": 6.700313971742542e-05,
421
+ "loss": 4.2786,
422
+ "step": 5400
423
+ },
424
+ {
425
+ "epoch": 8.12,
426
+ "learning_rate": 6.676766091051805e-05,
427
+ "loss": 4.214,
428
+ "step": 5500
429
+ },
430
+ {
431
+ "epoch": 8.12,
432
+ "eval_loss": 3.339416027069092,
433
+ "eval_runtime": 116.9868,
434
+ "eval_samples_per_second": 17.275,
435
+ "eval_steps_per_second": 2.163,
436
+ "eval_wer": 2.1429985155863434,
437
+ "step": 5500
438
+ },
439
+ {
440
+ "epoch": 8.27,
441
+ "learning_rate": 6.653218210361068e-05,
442
+ "loss": 4.1206,
443
+ "step": 5600
444
+ },
445
+ {
446
+ "epoch": 8.42,
447
+ "learning_rate": 6.62967032967033e-05,
448
+ "loss": 4.081,
449
+ "step": 5700
450
+ },
451
+ {
452
+ "epoch": 8.57,
453
+ "learning_rate": 6.606122448979591e-05,
454
+ "loss": 4.0059,
455
+ "step": 5800
456
+ },
457
+ {
458
+ "epoch": 8.71,
459
+ "learning_rate": 6.582574568288854e-05,
460
+ "loss": 3.9251,
461
+ "step": 5900
462
+ },
463
+ {
464
+ "epoch": 8.86,
465
+ "learning_rate": 6.559262166405023e-05,
466
+ "loss": 3.8992,
467
+ "step": 6000
468
+ },
469
+ {
470
+ "epoch": 8.86,
471
+ "eval_loss": 2.9084665775299072,
472
+ "eval_runtime": 119.0907,
473
+ "eval_samples_per_second": 16.97,
474
+ "eval_steps_per_second": 2.124,
475
+ "eval_wer": 2.153389411182583,
476
+ "step": 6000
477
+ },
478
+ {
479
+ "epoch": 9.01,
480
+ "learning_rate": 6.535714285714285e-05,
481
+ "loss": 3.8494,
482
+ "step": 6100
483
+ },
484
+ {
485
+ "epoch": 9.16,
486
+ "learning_rate": 6.512166405023547e-05,
487
+ "loss": 3.7923,
488
+ "step": 6200
489
+ },
490
+ {
491
+ "epoch": 9.31,
492
+ "learning_rate": 6.48861852433281e-05,
493
+ "loss": 3.7416,
494
+ "step": 6300
495
+ },
496
+ {
497
+ "epoch": 9.45,
498
+ "learning_rate": 6.465070643642071e-05,
499
+ "loss": 3.7095,
500
+ "step": 6400
501
+ },
502
+ {
503
+ "epoch": 9.6,
504
+ "learning_rate": 6.441522762951334e-05,
505
+ "loss": 3.6481,
506
+ "step": 6500
507
+ },
508
+ {
509
+ "epoch": 9.6,
510
+ "eval_loss": 2.620758295059204,
511
+ "eval_runtime": 115.2407,
512
+ "eval_samples_per_second": 17.537,
513
+ "eval_steps_per_second": 2.195,
514
+ "eval_wer": 2.3537852548243445,
515
+ "step": 6500
516
+ },
517
+ {
518
+ "epoch": 9.75,
519
+ "learning_rate": 6.417974882260596e-05,
520
+ "loss": 3.6196,
521
+ "step": 6600
522
+ },
523
+ {
524
+ "epoch": 9.9,
525
+ "learning_rate": 6.394427001569859e-05,
526
+ "loss": 3.5941,
527
+ "step": 6700
528
+ },
529
+ {
530
+ "epoch": 10.04,
531
+ "learning_rate": 6.37087912087912e-05,
532
+ "loss": 3.5608,
533
+ "step": 6800
534
+ },
535
+ {
536
+ "epoch": 10.19,
537
+ "learning_rate": 6.347331240188383e-05,
538
+ "loss": 3.5296,
539
+ "step": 6900
540
+ },
541
+ {
542
+ "epoch": 10.34,
543
+ "learning_rate": 6.324018838304552e-05,
544
+ "loss": 3.4658,
545
+ "step": 7000
546
+ },
547
+ {
548
+ "epoch": 10.34,
549
+ "eval_loss": 2.3172152042388916,
550
+ "eval_runtime": 114.5436,
551
+ "eval_samples_per_second": 17.644,
552
+ "eval_steps_per_second": 2.209,
553
+ "eval_wer": 2.227115289460663,
554
+ "step": 7000
555
+ },
556
+ {
557
+ "epoch": 10.49,
558
+ "learning_rate": 6.300470957613814e-05,
559
+ "loss": 3.3977,
560
+ "step": 7100
561
+ },
562
+ {
563
+ "epoch": 10.63,
564
+ "learning_rate": 6.276923076923076e-05,
565
+ "loss": 3.3987,
566
+ "step": 7200
567
+ },
568
+ {
569
+ "epoch": 10.78,
570
+ "learning_rate": 6.253375196232339e-05,
571
+ "loss": 3.3587,
572
+ "step": 7300
573
+ },
574
+ {
575
+ "epoch": 10.93,
576
+ "learning_rate": 6.2298273155416e-05,
577
+ "loss": 3.2796,
578
+ "step": 7400
579
+ },
580
+ {
581
+ "epoch": 11.08,
582
+ "learning_rate": 6.206279434850863e-05,
583
+ "loss": 3.257,
584
+ "step": 7500
585
+ },
586
+ {
587
+ "epoch": 11.08,
588
+ "eval_loss": 2.0916049480438232,
589
+ "eval_runtime": 113.9408,
590
+ "eval_samples_per_second": 17.737,
591
+ "eval_steps_per_second": 2.22,
592
+ "eval_wer": 2.1350816427511132,
593
+ "step": 7500
594
+ },
595
+ {
596
+ "epoch": 11.23,
597
+ "learning_rate": 6.182731554160125e-05,
598
+ "loss": 3.2476,
599
+ "step": 7600
600
+ },
601
+ {
602
+ "epoch": 11.37,
603
+ "learning_rate": 6.159183673469388e-05,
604
+ "loss": 3.2463,
605
+ "step": 7700
606
+ },
607
+ {
608
+ "epoch": 11.52,
609
+ "learning_rate": 6.135635792778649e-05,
610
+ "loss": 3.2323,
611
+ "step": 7800
612
+ },
613
+ {
614
+ "epoch": 11.67,
615
+ "learning_rate": 6.112087912087912e-05,
616
+ "loss": 3.1674,
617
+ "step": 7900
618
+ },
619
+ {
620
+ "epoch": 11.82,
621
+ "learning_rate": 6.088540031397174e-05,
622
+ "loss": 3.1294,
623
+ "step": 8000
624
+ },
625
+ {
626
+ "epoch": 11.82,
627
+ "eval_loss": 1.895378828048706,
628
+ "eval_runtime": 115.1394,
629
+ "eval_samples_per_second": 17.553,
630
+ "eval_steps_per_second": 2.197,
631
+ "eval_wer": 2.2132607619990106,
632
+ "step": 8000
633
+ },
634
+ {
635
+ "epoch": 11.96,
636
+ "learning_rate": 6.0649921507064355e-05,
637
+ "loss": 3.1262,
638
+ "step": 8100
639
+ },
640
+ {
641
+ "epoch": 12.11,
642
+ "learning_rate": 6.041444270015698e-05,
643
+ "loss": 3.0377,
644
+ "step": 8200
645
+ },
646
+ {
647
+ "epoch": 12.26,
648
+ "learning_rate": 6.01789638932496e-05,
649
+ "loss": 3.0306,
650
+ "step": 8300
651
+ },
652
+ {
653
+ "epoch": 12.41,
654
+ "learning_rate": 5.994348508634223e-05,
655
+ "loss": 3.0425,
656
+ "step": 8400
657
+ },
658
+ {
659
+ "epoch": 12.56,
660
+ "learning_rate": 5.9710361067503915e-05,
661
+ "loss": 3.0266,
662
+ "step": 8500
663
+ },
664
+ {
665
+ "epoch": 12.56,
666
+ "eval_loss": 1.76727294921875,
667
+ "eval_runtime": 114.3494,
668
+ "eval_samples_per_second": 17.674,
669
+ "eval_steps_per_second": 2.213,
670
+ "eval_wer": 2.0895596239485403,
671
+ "step": 8500
672
+ },
673
+ {
674
+ "epoch": 12.7,
675
+ "learning_rate": 5.9474882260596537e-05,
676
+ "loss": 3.0398,
677
+ "step": 8600
678
+ },
679
+ {
680
+ "epoch": 12.85,
681
+ "learning_rate": 5.9239403453689165e-05,
682
+ "loss": 2.9985,
683
+ "step": 8700
684
+ },
685
+ {
686
+ "epoch": 13.0,
687
+ "learning_rate": 5.900392464678179e-05,
688
+ "loss": 2.9969,
689
+ "step": 8800
690
+ },
691
+ {
692
+ "epoch": 13.15,
693
+ "learning_rate": 5.876844583987441e-05,
694
+ "loss": 2.9648,
695
+ "step": 8900
696
+ },
697
+ {
698
+ "epoch": 13.29,
699
+ "learning_rate": 5.8532967032967024e-05,
700
+ "loss": 2.9451,
701
+ "step": 9000
702
+ },
703
+ {
704
+ "epoch": 13.29,
705
+ "eval_loss": 1.665855884552002,
706
+ "eval_runtime": 116.4877,
707
+ "eval_samples_per_second": 17.349,
708
+ "eval_steps_per_second": 2.172,
709
+ "eval_wer": 2.1380504700643246,
710
+ "step": 9000
711
+ },
712
+ {
713
+ "epoch": 13.44,
714
+ "learning_rate": 5.8297488226059645e-05,
715
+ "loss": 2.9573,
716
+ "step": 9100
717
+ },
718
+ {
719
+ "epoch": 13.59,
720
+ "learning_rate": 5.8062009419152274e-05,
721
+ "loss": 2.8819,
722
+ "step": 9200
723
+ },
724
+ {
725
+ "epoch": 13.74,
726
+ "learning_rate": 5.7826530612244896e-05,
727
+ "loss": 2.8901,
728
+ "step": 9300
729
+ },
730
+ {
731
+ "epoch": 13.88,
732
+ "learning_rate": 5.759105180533752e-05,
733
+ "loss": 2.8492,
734
+ "step": 9400
735
+ },
736
+ {
737
+ "epoch": 14.03,
738
+ "learning_rate": 5.735557299843013e-05,
739
+ "loss": 2.8802,
740
+ "step": 9500
741
+ },
742
+ {
743
+ "epoch": 14.03,
744
+ "eval_loss": 1.5637215375900269,
745
+ "eval_runtime": 115.1622,
746
+ "eval_samples_per_second": 17.549,
747
+ "eval_steps_per_second": 2.197,
748
+ "eval_wer": 2.1969322117763483,
749
+ "step": 9500
750
+ },
751
+ {
752
+ "epoch": 14.18,
753
+ "learning_rate": 5.7120094191522754e-05,
754
+ "loss": 2.8346,
755
+ "step": 9600
756
+ },
757
+ {
758
+ "epoch": 14.33,
759
+ "learning_rate": 5.6884615384615376e-05,
760
+ "loss": 2.8355,
761
+ "step": 9700
762
+ },
763
+ {
764
+ "epoch": 14.48,
765
+ "learning_rate": 5.6649136577708005e-05,
766
+ "loss": 2.8124,
767
+ "step": 9800
768
+ },
769
+ {
770
+ "epoch": 14.62,
771
+ "learning_rate": 5.6413657770800626e-05,
772
+ "loss": 2.7879,
773
+ "step": 9900
774
+ },
775
+ {
776
+ "epoch": 14.77,
777
+ "learning_rate": 5.617817896389324e-05,
778
+ "loss": 2.78,
779
+ "step": 10000
780
+ },
781
+ {
782
+ "epoch": 14.77,
783
+ "eval_loss": 1.4921427965164185,
784
+ "eval_runtime": 115.1,
785
+ "eval_samples_per_second": 17.559,
786
+ "eval_steps_per_second": 2.198,
787
+ "eval_wer": 2.2335477486392876,
788
+ "step": 10000
789
+ },
790
+ {
791
+ "epoch": 14.92,
792
+ "learning_rate": 5.594270015698586e-05,
793
+ "loss": 2.775,
794
+ "step": 10100
795
+ },
796
+ {
797
+ "epoch": 15.07,
798
+ "learning_rate": 5.5707221350078485e-05,
799
+ "loss": 2.7478,
800
+ "step": 10200
801
+ },
802
+ {
803
+ "epoch": 15.21,
804
+ "learning_rate": 5.5471742543171114e-05,
805
+ "loss": 2.7224,
806
+ "step": 10300
807
+ },
808
+ {
809
+ "epoch": 15.36,
810
+ "learning_rate": 5.5236263736263735e-05,
811
+ "loss": 2.7506,
812
+ "step": 10400
813
+ },
814
+ {
815
+ "epoch": 15.51,
816
+ "learning_rate": 5.500078492935635e-05,
817
+ "loss": 2.7049,
818
+ "step": 10500
819
+ },
820
+ {
821
+ "epoch": 15.51,
822
+ "eval_loss": 1.413183569908142,
823
+ "eval_runtime": 114.2743,
824
+ "eval_samples_per_second": 17.686,
825
+ "eval_steps_per_second": 2.214,
826
+ "eval_wer": 2.221672439386442,
827
+ "step": 10500
828
+ },
829
+ {
830
+ "epoch": 15.66,
831
+ "learning_rate": 5.476766091051805e-05,
832
+ "loss": 2.7145,
833
+ "step": 10600
834
+ },
835
+ {
836
+ "epoch": 15.8,
837
+ "learning_rate": 5.453218210361067e-05,
838
+ "loss": 2.6892,
839
+ "step": 10700
840
+ },
841
+ {
842
+ "epoch": 15.95,
843
+ "learning_rate": 5.4296703296703295e-05,
844
+ "loss": 2.69,
845
+ "step": 10800
846
+ },
847
+ {
848
+ "epoch": 16.1,
849
+ "learning_rate": 5.406122448979591e-05,
850
+ "loss": 2.623,
851
+ "step": 10900
852
+ },
853
+ {
854
+ "epoch": 16.25,
855
+ "learning_rate": 5.382574568288853e-05,
856
+ "loss": 2.6768,
857
+ "step": 11000
858
+ },
859
+ {
860
+ "epoch": 16.25,
861
+ "eval_loss": 1.3666878938674927,
862
+ "eval_runtime": 119.4402,
863
+ "eval_samples_per_second": 16.921,
864
+ "eval_steps_per_second": 2.118,
865
+ "eval_wer": 2.223156853043048,
866
+ "step": 11000
867
+ },
868
+ {
869
+ "epoch": 16.4,
870
+ "learning_rate": 5.359262166405023e-05,
871
+ "loss": 2.628,
872
+ "step": 11100
873
+ },
874
+ {
875
+ "epoch": 16.54,
876
+ "learning_rate": 5.3357142857142854e-05,
877
+ "loss": 2.6163,
878
+ "step": 11200
879
+ },
880
+ {
881
+ "epoch": 16.69,
882
+ "learning_rate": 5.312166405023547e-05,
883
+ "loss": 2.6193,
884
+ "step": 11300
885
+ },
886
+ {
887
+ "epoch": 16.84,
888
+ "learning_rate": 5.28861852433281e-05,
889
+ "loss": 2.6531,
890
+ "step": 11400
891
+ },
892
+ {
893
+ "epoch": 16.99,
894
+ "learning_rate": 5.265070643642072e-05,
895
+ "loss": 2.6358,
896
+ "step": 11500
897
+ },
898
+ {
899
+ "epoch": 16.99,
900
+ "eval_loss": 1.311090111732483,
901
+ "eval_runtime": 116.2157,
902
+ "eval_samples_per_second": 17.39,
903
+ "eval_steps_per_second": 2.177,
904
+ "eval_wer": 2.128649183572489,
905
+ "step": 11500
906
+ },
907
+ {
908
+ "epoch": 17.13,
909
+ "learning_rate": 5.241522762951334e-05,
910
+ "loss": 2.5748,
911
+ "step": 11600
912
+ },
913
+ {
914
+ "epoch": 17.28,
915
+ "learning_rate": 5.217974882260596e-05,
916
+ "loss": 2.6287,
917
+ "step": 11700
918
+ },
919
+ {
920
+ "epoch": 17.43,
921
+ "learning_rate": 5.194427001569858e-05,
922
+ "loss": 2.5583,
923
+ "step": 11800
924
+ },
925
+ {
926
+ "epoch": 17.58,
927
+ "learning_rate": 5.17087912087912e-05,
928
+ "loss": 2.5547,
929
+ "step": 11900
930
+ },
931
+ {
932
+ "epoch": 17.72,
933
+ "learning_rate": 5.147331240188383e-05,
934
+ "loss": 2.5802,
935
+ "step": 12000
936
+ },
937
+ {
938
+ "epoch": 17.72,
939
+ "eval_loss": 1.2678567171096802,
940
+ "eval_runtime": 116.076,
941
+ "eval_samples_per_second": 17.411,
942
+ "eval_steps_per_second": 2.18,
943
+ "eval_wer": 2.1429985155863434,
944
+ "step": 12000
945
+ },
946
+ {
947
+ "epoch": 17.87,
948
+ "learning_rate": 5.123783359497645e-05,
949
+ "loss": 2.557,
950
+ "step": 12100
951
+ },
952
+ {
953
+ "epoch": 18.02,
954
+ "learning_rate": 5.100235478806907e-05,
955
+ "loss": 2.5771,
956
+ "step": 12200
957
+ },
958
+ {
959
+ "epoch": 18.17,
960
+ "learning_rate": 5.076687598116169e-05,
961
+ "loss": 2.5393,
962
+ "step": 12300
963
+ },
964
+ {
965
+ "epoch": 18.32,
966
+ "learning_rate": 5.053375196232339e-05,
967
+ "loss": 2.5031,
968
+ "step": 12400
969
+ },
970
+ {
971
+ "epoch": 18.46,
972
+ "learning_rate": 5.029827315541601e-05,
973
+ "loss": 2.5012,
974
+ "step": 12500
975
+ },
976
+ {
977
+ "epoch": 18.46,
978
+ "eval_loss": 1.2365446090698242,
979
+ "eval_runtime": 116.0118,
980
+ "eval_samples_per_second": 17.421,
981
+ "eval_steps_per_second": 2.181,
982
+ "eval_wer": 2.115289460663038,
983
+ "step": 12500
984
+ },
985
+ {
986
+ "epoch": 18.61,
987
+ "learning_rate": 5.006279434850863e-05,
988
+ "loss": 2.54,
989
+ "step": 12600
990
+ },
991
+ {
992
+ "epoch": 18.76,
993
+ "learning_rate": 4.9827315541601246e-05,
994
+ "loss": 2.5072,
995
+ "step": 12700
996
+ },
997
+ {
998
+ "epoch": 18.91,
999
+ "learning_rate": 4.9591836734693875e-05,
1000
+ "loss": 2.4951,
1001
+ "step": 12800
1002
+ },
1003
+ {
1004
+ "epoch": 19.05,
1005
+ "learning_rate": 4.9356357927786497e-05,
1006
+ "loss": 2.4789,
1007
+ "step": 12900
1008
+ },
1009
+ {
1010
+ "epoch": 19.2,
1011
+ "learning_rate": 4.912087912087912e-05,
1012
+ "loss": 2.458,
1013
+ "step": 13000
1014
+ },
1015
+ {
1016
+ "epoch": 19.2,
1017
+ "eval_loss": 1.2117862701416016,
1018
+ "eval_runtime": 116.2579,
1019
+ "eval_samples_per_second": 17.384,
1020
+ "eval_steps_per_second": 2.176,
1021
+ "eval_wer": 2.1573478476001977,
1022
+ "step": 13000
1023
+ },
1024
+ {
1025
+ "epoch": 19.35,
1026
+ "learning_rate": 4.888540031397174e-05,
1027
+ "loss": 2.4616,
1028
+ "step": 13100
1029
+ },
1030
+ {
1031
+ "epoch": 19.5,
1032
+ "learning_rate": 4.8649921507064355e-05,
1033
+ "loss": 2.4739,
1034
+ "step": 13200
1035
+ },
1036
+ {
1037
+ "epoch": 19.65,
1038
+ "learning_rate": 4.8414442700156984e-05,
1039
+ "loss": 2.4867,
1040
+ "step": 13300
1041
+ },
1042
+ {
1043
+ "epoch": 19.79,
1044
+ "learning_rate": 4.8178963893249605e-05,
1045
+ "loss": 2.4568,
1046
+ "step": 13400
1047
+ },
1048
+ {
1049
+ "epoch": 19.94,
1050
+ "learning_rate": 4.794348508634223e-05,
1051
+ "loss": 2.4433,
1052
+ "step": 13500
1053
+ },
1054
+ {
1055
+ "epoch": 19.94,
1056
+ "eval_loss": 1.1991767883300781,
1057
+ "eval_runtime": 114.5641,
1058
+ "eval_samples_per_second": 17.641,
1059
+ "eval_steps_per_second": 2.208,
1060
+ "eval_wer": 2.1335972290945078,
1061
+ "step": 13500
1062
+ },
1063
+ {
1064
+ "epoch": 20.09,
1065
+ "learning_rate": 4.770800627943485e-05,
1066
+ "loss": 2.4532,
1067
+ "step": 13600
1068
+ },
1069
+ {
1070
+ "epoch": 20.24,
1071
+ "learning_rate": 4.7472527472527464e-05,
1072
+ "loss": 2.3913,
1073
+ "step": 13700
1074
+ },
1075
+ {
1076
+ "epoch": 20.38,
1077
+ "learning_rate": 4.7237048665620086e-05,
1078
+ "loss": 2.421,
1079
+ "step": 13800
1080
+ },
1081
+ {
1082
+ "epoch": 20.53,
1083
+ "learning_rate": 4.7001569858712714e-05,
1084
+ "loss": 2.4526,
1085
+ "step": 13900
1086
+ },
1087
+ {
1088
+ "epoch": 20.68,
1089
+ "learning_rate": 4.6766091051805336e-05,
1090
+ "loss": 2.438,
1091
+ "step": 14000
1092
+ },
1093
+ {
1094
+ "epoch": 20.68,
1095
+ "eval_loss": 1.180332064628601,
1096
+ "eval_runtime": 116.5012,
1097
+ "eval_samples_per_second": 17.347,
1098
+ "eval_steps_per_second": 2.172,
1099
+ "eval_wer": 2.1509153884215735,
1100
+ "step": 14000
1101
+ },
1102
+ {
1103
+ "epoch": 20.83,
1104
+ "learning_rate": 4.653061224489796e-05,
1105
+ "loss": 2.4034,
1106
+ "step": 14100
1107
+ },
1108
+ {
1109
+ "epoch": 20.97,
1110
+ "learning_rate": 4.629513343799057e-05,
1111
+ "loss": 2.4306,
1112
+ "step": 14200
1113
+ },
1114
+ {
1115
+ "epoch": 21.12,
1116
+ "learning_rate": 4.6059654631083195e-05,
1117
+ "loss": 2.4145,
1118
+ "step": 14300
1119
+ },
1120
+ {
1121
+ "epoch": 21.27,
1122
+ "learning_rate": 4.582417582417582e-05,
1123
+ "loss": 2.4677,
1124
+ "step": 14400
1125
+ },
1126
+ {
1127
+ "epoch": 21.42,
1128
+ "learning_rate": 4.5588697017268445e-05,
1129
+ "loss": 2.418,
1130
+ "step": 14500
1131
+ },
1132
+ {
1133
+ "epoch": 21.42,
1134
+ "eval_loss": 1.1601430177688599,
1135
+ "eval_runtime": 114.5652,
1136
+ "eval_samples_per_second": 17.641,
1137
+ "eval_steps_per_second": 2.208,
1138
+ "eval_wer": 2.1232063334982683,
1139
+ "step": 14500
1140
+ },
1141
+ {
1142
+ "epoch": 21.57,
1143
+ "learning_rate": 4.535321821036107e-05,
1144
+ "loss": 2.3967,
1145
+ "step": 14600
1146
+ },
1147
+ {
1148
+ "epoch": 21.71,
1149
+ "learning_rate": 4.511773940345368e-05,
1150
+ "loss": 2.3939,
1151
+ "step": 14700
1152
+ },
1153
+ {
1154
+ "epoch": 21.86,
1155
+ "learning_rate": 4.4882260596546304e-05,
1156
+ "loss": 2.3925,
1157
+ "step": 14800
1158
+ },
1159
+ {
1160
+ "epoch": 22.01,
1161
+ "learning_rate": 4.4646781789638925e-05,
1162
+ "loss": 2.3596,
1163
+ "step": 14900
1164
+ },
1165
+ {
1166
+ "epoch": 22.16,
1167
+ "learning_rate": 4.4411302982731554e-05,
1168
+ "loss": 2.3322,
1169
+ "step": 15000
1170
+ },
1171
+ {
1172
+ "epoch": 22.16,
1173
+ "eval_loss": 1.1417704820632935,
1174
+ "eval_runtime": 116.2111,
1175
+ "eval_samples_per_second": 17.391,
1176
+ "eval_steps_per_second": 2.177,
1177
+ "eval_wer": 2.1929737753587335,
1178
+ "step": 15000
1179
+ },
1180
+ {
1181
+ "epoch": 22.3,
1182
+ "learning_rate": 4.4175824175824176e-05,
1183
+ "loss": 2.3821,
1184
+ "step": 15100
1185
+ },
1186
+ {
1187
+ "epoch": 22.45,
1188
+ "learning_rate": 4.394034536891679e-05,
1189
+ "loss": 2.3435,
1190
+ "step": 15200
1191
+ },
1192
+ {
1193
+ "epoch": 22.6,
1194
+ "learning_rate": 4.370486656200941e-05,
1195
+ "loss": 2.3542,
1196
+ "step": 15300
1197
+ },
1198
+ {
1199
+ "epoch": 22.75,
1200
+ "learning_rate": 4.3469387755102034e-05,
1201
+ "loss": 2.3469,
1202
+ "step": 15400
1203
+ },
1204
+ {
1205
+ "epoch": 22.89,
1206
+ "learning_rate": 4.323390894819466e-05,
1207
+ "loss": 2.3387,
1208
+ "step": 15500
1209
+ },
1210
+ {
1211
+ "epoch": 22.89,
1212
+ "eval_loss": 1.1172302961349487,
1213
+ "eval_runtime": 114.3169,
1214
+ "eval_samples_per_second": 17.679,
1215
+ "eval_steps_per_second": 2.213,
1216
+ "eval_wer": 2.2464126669965365,
1217
+ "step": 15500
1218
+ },
1219
+ {
1220
+ "epoch": 23.04,
1221
+ "learning_rate": 4.2998430141287285e-05,
1222
+ "loss": 2.3688,
1223
+ "step": 15600
1224
+ },
1225
+ {
1226
+ "epoch": 23.19,
1227
+ "learning_rate": 4.27629513343799e-05,
1228
+ "loss": 2.3344,
1229
+ "step": 15700
1230
+ },
1231
+ {
1232
+ "epoch": 23.34,
1233
+ "learning_rate": 4.252747252747252e-05,
1234
+ "loss": 2.3245,
1235
+ "step": 15800
1236
+ },
1237
+ {
1238
+ "epoch": 23.49,
1239
+ "learning_rate": 4.229199372056514e-05,
1240
+ "loss": 2.3523,
1241
+ "step": 15900
1242
+ },
1243
+ {
1244
+ "epoch": 23.63,
1245
+ "learning_rate": 4.205651491365777e-05,
1246
+ "loss": 2.3349,
1247
+ "step": 16000
1248
+ },
1249
+ {
1250
+ "epoch": 23.63,
1251
+ "eval_loss": 1.1144375801086426,
1252
+ "eval_runtime": 116.2412,
1253
+ "eval_samples_per_second": 17.386,
1254
+ "eval_steps_per_second": 2.177,
1255
+ "eval_wer": 2.185551707075705,
1256
+ "step": 16000
1257
+ },
1258
+ {
1259
+ "epoch": 23.78,
1260
+ "learning_rate": 4.1821036106750393e-05,
1261
+ "loss": 2.2847,
1262
+ "step": 16100
1263
+ },
1264
+ {
1265
+ "epoch": 23.93,
1266
+ "learning_rate": 4.158555729984301e-05,
1267
+ "loss": 2.3303,
1268
+ "step": 16200
1269
+ },
1270
+ {
1271
+ "epoch": 24.08,
1272
+ "learning_rate": 4.135007849293563e-05,
1273
+ "loss": 2.2994,
1274
+ "step": 16300
1275
+ },
1276
+ {
1277
+ "epoch": 24.22,
1278
+ "learning_rate": 4.111459968602825e-05,
1279
+ "loss": 2.2887,
1280
+ "step": 16400
1281
+ },
1282
+ {
1283
+ "epoch": 24.37,
1284
+ "learning_rate": 4.0879120879120874e-05,
1285
+ "loss": 2.291,
1286
+ "step": 16500
1287
+ },
1288
+ {
1289
+ "epoch": 24.37,
1290
+ "eval_loss": 1.1018128395080566,
1291
+ "eval_runtime": 114.9042,
1292
+ "eval_samples_per_second": 17.589,
1293
+ "eval_steps_per_second": 2.202,
1294
+ "eval_wer": 2.1929737753587335,
1295
+ "step": 16500
1296
+ },
1297
+ {
1298
+ "epoch": 24.52,
1299
+ "learning_rate": 4.06436420722135e-05,
1300
+ "loss": 2.2888,
1301
+ "step": 16600
1302
+ },
1303
+ {
1304
+ "epoch": 24.67,
1305
+ "learning_rate": 4.040816326530612e-05,
1306
+ "loss": 2.2724,
1307
+ "step": 16700
1308
+ },
1309
+ {
1310
+ "epoch": 24.82,
1311
+ "learning_rate": 4.017268445839874e-05,
1312
+ "loss": 2.2922,
1313
+ "step": 16800
1314
+ },
1315
+ {
1316
+ "epoch": 24.96,
1317
+ "learning_rate": 3.993720565149136e-05,
1318
+ "loss": 2.2934,
1319
+ "step": 16900
1320
+ },
1321
+ {
1322
+ "epoch": 25.11,
1323
+ "learning_rate": 3.970172684458398e-05,
1324
+ "loss": 2.2766,
1325
+ "step": 17000
1326
+ },
1327
+ {
1328
+ "epoch": 25.11,
1329
+ "eval_loss": 1.0882744789123535,
1330
+ "eval_runtime": 117.2941,
1331
+ "eval_samples_per_second": 17.23,
1332
+ "eval_steps_per_second": 2.157,
1333
+ "eval_wer": 2.1761504205838693,
1334
+ "step": 17000
1335
+ },
1336
+ {
1337
+ "epoch": 25.26,
1338
+ "learning_rate": 3.946624803767661e-05,
1339
+ "loss": 2.2656,
1340
+ "step": 17100
1341
+ },
1342
+ {
1343
+ "epoch": 25.41,
1344
+ "learning_rate": 3.9230769230769226e-05,
1345
+ "loss": 2.2929,
1346
+ "step": 17200
1347
+ },
1348
+ {
1349
+ "epoch": 25.55,
1350
+ "learning_rate": 3.899529042386185e-05,
1351
+ "loss": 2.2513,
1352
+ "step": 17300
1353
+ },
1354
+ {
1355
+ "epoch": 25.7,
1356
+ "learning_rate": 3.875981161695447e-05,
1357
+ "loss": 2.2603,
1358
+ "step": 17400
1359
+ },
1360
+ {
1361
+ "epoch": 25.85,
1362
+ "learning_rate": 3.852433281004709e-05,
1363
+ "loss": 2.2534,
1364
+ "step": 17500
1365
+ },
1366
+ {
1367
+ "epoch": 25.85,
1368
+ "eval_loss": 1.0743526220321655,
1369
+ "eval_runtime": 118.2043,
1370
+ "eval_samples_per_second": 17.098,
1371
+ "eval_steps_per_second": 2.14,
1372
+ "eval_wer": 2.1875309252845128,
1373
+ "step": 17500
1374
+ },
1375
+ {
1376
+ "epoch": 26.0,
1377
+ "learning_rate": 3.8288854003139713e-05,
1378
+ "loss": 2.2716,
1379
+ "step": 17600
1380
+ },
1381
+ {
1382
+ "epoch": 26.14,
1383
+ "learning_rate": 3.8053375196232335e-05,
1384
+ "loss": 2.2486,
1385
+ "step": 17700
1386
+ },
1387
+ {
1388
+ "epoch": 26.29,
1389
+ "learning_rate": 3.781789638932496e-05,
1390
+ "loss": 2.2068,
1391
+ "step": 17800
1392
+ },
1393
+ {
1394
+ "epoch": 26.44,
1395
+ "learning_rate": 3.758241758241758e-05,
1396
+ "loss": 2.2431,
1397
+ "step": 17900
1398
+ },
1399
+ {
1400
+ "epoch": 26.59,
1401
+ "learning_rate": 3.73469387755102e-05,
1402
+ "loss": 2.2393,
1403
+ "step": 18000
1404
+ },
1405
+ {
1406
+ "epoch": 26.59,
1407
+ "eval_loss": 1.0561192035675049,
1408
+ "eval_runtime": 116.8996,
1409
+ "eval_samples_per_second": 17.288,
1410
+ "eval_steps_per_second": 2.164,
1411
+ "eval_wer": 2.1845620979713014,
1412
+ "step": 18000
1413
+ },
1414
+ {
1415
+ "epoch": 26.74,
1416
+ "learning_rate": 3.711145996860282e-05,
1417
+ "loss": 2.1944,
1418
+ "step": 18100
1419
+ },
1420
+ {
1421
+ "epoch": 26.88,
1422
+ "learning_rate": 3.6875981161695444e-05,
1423
+ "loss": 2.2359,
1424
+ "step": 18200
1425
+ },
1426
+ {
1427
+ "epoch": 27.03,
1428
+ "learning_rate": 3.664285714285714e-05,
1429
+ "loss": 2.2097,
1430
+ "step": 18300
1431
+ },
1432
+ {
1433
+ "epoch": 27.18,
1434
+ "learning_rate": 3.640737833594976e-05,
1435
+ "loss": 2.1431,
1436
+ "step": 18400
1437
+ },
1438
+ {
1439
+ "epoch": 27.33,
1440
+ "learning_rate": 3.617189952904238e-05,
1441
+ "loss": 2.2085,
1442
+ "step": 18500
1443
+ },
1444
+ {
1445
+ "epoch": 27.33,
1446
+ "eval_loss": 1.0465816259384155,
1447
+ "eval_runtime": 115.87,
1448
+ "eval_samples_per_second": 17.442,
1449
+ "eval_steps_per_second": 2.183,
1450
+ "eval_wer": 2.1444829292429493,
1451
+ "step": 18500
1452
+ },
1453
+ {
1454
+ "epoch": 27.47,
1455
+ "learning_rate": 3.5936420722135003e-05,
1456
+ "loss": 2.2204,
1457
+ "step": 18600
1458
+ },
1459
+ {
1460
+ "epoch": 27.62,
1461
+ "learning_rate": 3.5700941915227625e-05,
1462
+ "loss": 2.242,
1463
+ "step": 18700
1464
+ },
1465
+ {
1466
+ "epoch": 27.77,
1467
+ "learning_rate": 3.546546310832025e-05,
1468
+ "loss": 2.1699,
1469
+ "step": 18800
1470
+ },
1471
+ {
1472
+ "epoch": 27.92,
1473
+ "learning_rate": 3.522998430141287e-05,
1474
+ "loss": 2.2152,
1475
+ "step": 18900
1476
+ },
1477
+ {
1478
+ "epoch": 28.06,
1479
+ "learning_rate": 3.499450549450549e-05,
1480
+ "loss": 2.1966,
1481
+ "step": 19000
1482
+ },
1483
+ {
1484
+ "epoch": 28.06,
1485
+ "eval_loss": 1.0382250547409058,
1486
+ "eval_runtime": 116.4655,
1487
+ "eval_samples_per_second": 17.353,
1488
+ "eval_steps_per_second": 2.172,
1489
+ "eval_wer": 2.1088570014844135,
1490
+ "step": 19000
1491
+ },
1492
+ {
1493
+ "epoch": 28.21,
1494
+ "learning_rate": 3.475902668759811e-05,
1495
+ "loss": 2.169,
1496
+ "step": 19100
1497
+ },
1498
+ {
1499
+ "epoch": 28.36,
1500
+ "learning_rate": 3.4523547880690734e-05,
1501
+ "loss": 2.1981,
1502
+ "step": 19200
1503
+ },
1504
+ {
1505
+ "epoch": 28.51,
1506
+ "learning_rate": 3.4288069073783356e-05,
1507
+ "loss": 2.1692,
1508
+ "step": 19300
1509
+ },
1510
+ {
1511
+ "epoch": 28.66,
1512
+ "learning_rate": 3.405259026687598e-05,
1513
+ "loss": 2.1931,
1514
+ "step": 19400
1515
+ },
1516
+ {
1517
+ "epoch": 28.8,
1518
+ "learning_rate": 3.38171114599686e-05,
1519
+ "loss": 2.1794,
1520
+ "step": 19500
1521
+ },
1522
+ {
1523
+ "epoch": 28.8,
1524
+ "eval_loss": 1.0263785123825073,
1525
+ "eval_runtime": 114.5988,
1526
+ "eval_samples_per_second": 17.635,
1527
+ "eval_steps_per_second": 2.208,
1528
+ "eval_wer": 1.9861454725383474,
1529
+ "step": 19500
1530
+ },
1531
+ {
1532
+ "epoch": 28.95,
1533
+ "learning_rate": 3.358163265306122e-05,
1534
+ "loss": 2.1638,
1535
+ "step": 19600
1536
+ },
1537
+ {
1538
+ "epoch": 29.1,
1539
+ "learning_rate": 3.334615384615384e-05,
1540
+ "loss": 2.1714,
1541
+ "step": 19700
1542
+ },
1543
+ {
1544
+ "epoch": 29.25,
1545
+ "learning_rate": 3.3110675039246465e-05,
1546
+ "loss": 2.1514,
1547
+ "step": 19800
1548
+ },
1549
+ {
1550
+ "epoch": 29.39,
1551
+ "learning_rate": 3.2875196232339087e-05,
1552
+ "loss": 2.1374,
1553
+ "step": 19900
1554
+ },
1555
+ {
1556
+ "epoch": 29.54,
1557
+ "learning_rate": 3.263971742543171e-05,
1558
+ "loss": 2.1423,
1559
+ "step": 20000
1560
+ },
1561
+ {
1562
+ "epoch": 29.54,
1563
+ "eval_loss": 1.0245550870895386,
1564
+ "eval_runtime": 116.8375,
1565
+ "eval_samples_per_second": 17.298,
1566
+ "eval_steps_per_second": 2.165,
1567
+ "eval_wer": 1.9678377041068777,
1568
+ "step": 20000
1569
+ },
1570
+ {
1571
+ "epoch": 29.69,
1572
+ "learning_rate": 3.240423861852433e-05,
1573
+ "loss": 2.1807,
1574
+ "step": 20100
1575
+ },
1576
+ {
1577
+ "epoch": 29.84,
1578
+ "learning_rate": 3.216875981161695e-05,
1579
+ "loss": 2.1545,
1580
+ "step": 20200
1581
+ },
1582
+ {
1583
+ "epoch": 29.98,
1584
+ "learning_rate": 3.1933281004709574e-05,
1585
+ "loss": 2.1404,
1586
+ "step": 20300
1587
+ },
1588
+ {
1589
+ "epoch": 30.13,
1590
+ "learning_rate": 3.1697802197802195e-05,
1591
+ "loss": 2.1089,
1592
+ "step": 20400
1593
+ },
1594
+ {
1595
+ "epoch": 30.28,
1596
+ "learning_rate": 3.146232339089482e-05,
1597
+ "loss": 2.1649,
1598
+ "step": 20500
1599
+ },
1600
+ {
1601
+ "epoch": 30.28,
1602
+ "eval_loss": 0.9981661438941956,
1603
+ "eval_runtime": 116.056,
1604
+ "eval_samples_per_second": 17.414,
1605
+ "eval_steps_per_second": 2.18,
1606
+ "eval_wer": 2.000494804552202,
1607
+ "step": 20500
1608
+ },
1609
+ {
1610
+ "epoch": 30.43,
1611
+ "learning_rate": 3.122684458398744e-05,
1612
+ "loss": 2.1425,
1613
+ "step": 20600
1614
+ },
1615
+ {
1616
+ "epoch": 30.58,
1617
+ "learning_rate": 3.099136577708006e-05,
1618
+ "loss": 2.1357,
1619
+ "step": 20700
1620
+ },
1621
+ {
1622
+ "epoch": 30.72,
1623
+ "learning_rate": 3.0758241758241755e-05,
1624
+ "loss": 2.1251,
1625
+ "step": 20800
1626
+ },
1627
+ {
1628
+ "epoch": 30.87,
1629
+ "learning_rate": 3.052276295133438e-05,
1630
+ "loss": 2.1256,
1631
+ "step": 20900
1632
+ },
1633
+ {
1634
+ "epoch": 31.02,
1635
+ "learning_rate": 3.0287284144427e-05,
1636
+ "loss": 2.143,
1637
+ "step": 21000
1638
+ },
1639
+ {
1640
+ "epoch": 31.02,
1641
+ "eval_loss": 0.9985482692718506,
1642
+ "eval_runtime": 116.0424,
1643
+ "eval_samples_per_second": 17.416,
1644
+ "eval_steps_per_second": 2.18,
1645
+ "eval_wer": 2.045027214250371,
1646
+ "step": 21000
1647
+ },
1648
+ {
1649
+ "epoch": 31.17,
1650
+ "learning_rate": 3.005180533751962e-05,
1651
+ "loss": 2.0744,
1652
+ "step": 21100
1653
+ },
1654
+ {
1655
+ "epoch": 31.31,
1656
+ "learning_rate": 2.9816326530612242e-05,
1657
+ "loss": 2.0831,
1658
+ "step": 21200
1659
+ },
1660
+ {
1661
+ "epoch": 31.46,
1662
+ "learning_rate": 2.9583202511773936e-05,
1663
+ "loss": 2.1254,
1664
+ "step": 21300
1665
+ },
1666
+ {
1667
+ "epoch": 31.61,
1668
+ "learning_rate": 2.934772370486656e-05,
1669
+ "loss": 2.1357,
1670
+ "step": 21400
1671
+ },
1672
+ {
1673
+ "epoch": 31.76,
1674
+ "learning_rate": 2.911224489795918e-05,
1675
+ "loss": 2.1338,
1676
+ "step": 21500
1677
+ },
1678
+ {
1679
+ "epoch": 31.76,
1680
+ "eval_loss": 0.9932034611701965,
1681
+ "eval_runtime": 114.6961,
1682
+ "eval_samples_per_second": 17.62,
1683
+ "eval_steps_per_second": 2.206,
1684
+ "eval_wer": 2.0024740227610094,
1685
+ "step": 21500
1686
+ },
1687
+ {
1688
+ "epoch": 31.91,
1689
+ "learning_rate": 2.8876766091051805e-05,
1690
+ "loss": 2.1053,
1691
+ "step": 21600
1692
+ },
1693
+ {
1694
+ "epoch": 32.05,
1695
+ "learning_rate": 2.8641287284144426e-05,
1696
+ "loss": 2.1111,
1697
+ "step": 21700
1698
+ },
1699
+ {
1700
+ "epoch": 32.2,
1701
+ "learning_rate": 2.8405808477237045e-05,
1702
+ "loss": 2.1028,
1703
+ "step": 21800
1704
+ },
1705
+ {
1706
+ "epoch": 32.35,
1707
+ "learning_rate": 2.817032967032967e-05,
1708
+ "loss": 2.0879,
1709
+ "step": 21900
1710
+ },
1711
+ {
1712
+ "epoch": 32.5,
1713
+ "learning_rate": 2.793485086342229e-05,
1714
+ "loss": 2.1076,
1715
+ "step": 22000
1716
+ },
1717
+ {
1718
+ "epoch": 32.5,
1719
+ "eval_loss": 0.9902665019035339,
1720
+ "eval_runtime": 120.6987,
1721
+ "eval_samples_per_second": 16.744,
1722
+ "eval_steps_per_second": 2.096,
1723
+ "eval_wer": 2.0504700643245917,
1724
+ "step": 22000
1725
+ },
1726
+ {
1727
+ "epoch": 32.64,
1728
+ "learning_rate": 2.769937205651491e-05,
1729
+ "loss": 2.1107,
1730
+ "step": 22100
1731
+ },
1732
+ {
1733
+ "epoch": 32.79,
1734
+ "learning_rate": 2.7463893249607535e-05,
1735
+ "loss": 2.0953,
1736
+ "step": 22200
1737
+ },
1738
+ {
1739
+ "epoch": 32.94,
1740
+ "learning_rate": 2.7228414442700154e-05,
1741
+ "loss": 2.0619,
1742
+ "step": 22300
1743
+ },
1744
+ {
1745
+ "epoch": 33.09,
1746
+ "learning_rate": 2.6992935635792776e-05,
1747
+ "loss": 2.0531,
1748
+ "step": 22400
1749
+ },
1750
+ {
1751
+ "epoch": 33.23,
1752
+ "learning_rate": 2.6757456828885397e-05,
1753
+ "loss": 2.0519,
1754
+ "step": 22500
1755
+ },
1756
+ {
1757
+ "epoch": 33.23,
1758
+ "eval_loss": 0.9833839535713196,
1759
+ "eval_runtime": 116.5317,
1760
+ "eval_samples_per_second": 17.343,
1761
+ "eval_steps_per_second": 2.171,
1762
+ "eval_wer": 2.07372587827808,
1763
+ "step": 22500
1764
+ },
1765
+ {
1766
+ "epoch": 33.38,
1767
+ "learning_rate": 2.652197802197802e-05,
1768
+ "loss": 2.0493,
1769
+ "step": 22600
1770
+ },
1771
+ {
1772
+ "epoch": 33.53,
1773
+ "learning_rate": 2.6286499215070644e-05,
1774
+ "loss": 2.0749,
1775
+ "step": 22700
1776
+ },
1777
+ {
1778
+ "epoch": 33.68,
1779
+ "learning_rate": 2.6051020408163263e-05,
1780
+ "loss": 2.0838,
1781
+ "step": 22800
1782
+ },
1783
+ {
1784
+ "epoch": 33.83,
1785
+ "learning_rate": 2.5815541601255884e-05,
1786
+ "loss": 2.0629,
1787
+ "step": 22900
1788
+ },
1789
+ {
1790
+ "epoch": 33.97,
1791
+ "learning_rate": 2.5580062794348506e-05,
1792
+ "loss": 2.0534,
1793
+ "step": 23000
1794
+ },
1795
+ {
1796
+ "epoch": 33.97,
1797
+ "eval_loss": 0.9755652546882629,
1798
+ "eval_runtime": 114.923,
1799
+ "eval_samples_per_second": 17.586,
1800
+ "eval_steps_per_second": 2.201,
1801
+ "eval_wer": 2.024740227610094,
1802
+ "step": 23000
1803
+ },
1804
+ {
1805
+ "epoch": 34.12,
1806
+ "learning_rate": 2.5344583987441128e-05,
1807
+ "loss": 2.067,
1808
+ "step": 23100
1809
+ },
1810
+ {
1811
+ "epoch": 34.27,
1812
+ "learning_rate": 2.5109105180533746e-05,
1813
+ "loss": 2.0252,
1814
+ "step": 23200
1815
+ },
1816
+ {
1817
+ "epoch": 34.42,
1818
+ "learning_rate": 2.487362637362637e-05,
1819
+ "loss": 2.0483,
1820
+ "step": 23300
1821
+ },
1822
+ {
1823
+ "epoch": 34.56,
1824
+ "learning_rate": 2.4638147566718993e-05,
1825
+ "loss": 2.0464,
1826
+ "step": 23400
1827
+ },
1828
+ {
1829
+ "epoch": 34.71,
1830
+ "learning_rate": 2.4402668759811615e-05,
1831
+ "loss": 2.0121,
1832
+ "step": 23500
1833
+ },
1834
+ {
1835
+ "epoch": 34.71,
1836
+ "eval_loss": 0.968792736530304,
1837
+ "eval_runtime": 114.3088,
1838
+ "eval_samples_per_second": 17.68,
1839
+ "eval_steps_per_second": 2.213,
1840
+ "eval_wer": 2.1439881246907473,
1841
+ "step": 23500
1842
+ },
1843
+ {
1844
+ "epoch": 34.86,
1845
+ "learning_rate": 2.4167189952904237e-05,
1846
+ "loss": 2.036,
1847
+ "step": 23600
1848
+ },
1849
+ {
1850
+ "epoch": 35.01,
1851
+ "learning_rate": 2.3931711145996855e-05,
1852
+ "loss": 2.013,
1853
+ "step": 23700
1854
+ },
1855
+ {
1856
+ "epoch": 35.16,
1857
+ "learning_rate": 2.369623233908948e-05,
1858
+ "loss": 2.0043,
1859
+ "step": 23800
1860
+ },
1861
+ {
1862
+ "epoch": 35.3,
1863
+ "learning_rate": 2.3460753532182102e-05,
1864
+ "loss": 2.037,
1865
+ "step": 23900
1866
+ },
1867
+ {
1868
+ "epoch": 35.45,
1869
+ "learning_rate": 2.322527472527472e-05,
1870
+ "loss": 2.0161,
1871
+ "step": 24000
1872
+ },
1873
+ {
1874
+ "epoch": 35.45,
1875
+ "eval_loss": 0.9581586718559265,
1876
+ "eval_runtime": 115.925,
1877
+ "eval_samples_per_second": 17.434,
1878
+ "eval_steps_per_second": 2.182,
1879
+ "eval_wer": 2.1232063334982683,
1880
+ "step": 24000
1881
+ },
1882
+ {
1883
+ "epoch": 35.6,
1884
+ "learning_rate": 2.2989795918367346e-05,
1885
+ "loss": 2.0256,
1886
+ "step": 24100
1887
+ },
1888
+ {
1889
+ "epoch": 35.75,
1890
+ "learning_rate": 2.2754317111459968e-05,
1891
+ "loss": 2.0265,
1892
+ "step": 24200
1893
+ },
1894
+ {
1895
+ "epoch": 35.89,
1896
+ "learning_rate": 2.251883830455259e-05,
1897
+ "loss": 2.0298,
1898
+ "step": 24300
1899
+ },
1900
+ {
1901
+ "epoch": 36.04,
1902
+ "learning_rate": 2.228335949764521e-05,
1903
+ "loss": 2.0028,
1904
+ "step": 24400
1905
+ },
1906
+ {
1907
+ "epoch": 36.19,
1908
+ "learning_rate": 2.204788069073783e-05,
1909
+ "loss": 2.0178,
1910
+ "step": 24500
1911
+ },
1912
+ {
1913
+ "epoch": 36.19,
1914
+ "eval_loss": 0.9480372071266174,
1915
+ "eval_runtime": 116.8212,
1916
+ "eval_samples_per_second": 17.3,
1917
+ "eval_steps_per_second": 2.166,
1918
+ "eval_wer": 2.0895596239485403,
1919
+ "step": 24500
1920
+ },
1921
+ {
1922
+ "epoch": 36.34,
1923
+ "learning_rate": 2.1812401883830455e-05,
1924
+ "loss": 2.008,
1925
+ "step": 24600
1926
+ },
1927
+ {
1928
+ "epoch": 36.48,
1929
+ "learning_rate": 2.1576923076923076e-05,
1930
+ "loss": 2.0132,
1931
+ "step": 24700
1932
+ },
1933
+ {
1934
+ "epoch": 36.63,
1935
+ "learning_rate": 2.1341444270015695e-05,
1936
+ "loss": 2.0204,
1937
+ "step": 24800
1938
+ },
1939
+ {
1940
+ "epoch": 36.78,
1941
+ "learning_rate": 2.110596546310832e-05,
1942
+ "loss": 1.9806,
1943
+ "step": 24900
1944
+ },
1945
+ {
1946
+ "epoch": 36.93,
1947
+ "learning_rate": 2.087048665620094e-05,
1948
+ "loss": 2.0154,
1949
+ "step": 25000
1950
+ },
1951
+ {
1952
+ "epoch": 36.93,
1953
+ "eval_loss": 0.9483017325401306,
1954
+ "eval_runtime": 117.4294,
1955
+ "eval_samples_per_second": 17.21,
1956
+ "eval_steps_per_second": 2.154,
1957
+ "eval_wer": 2.078673923800099,
1958
+ "step": 25000
1959
+ },
1960
+ {
1961
+ "epoch": 37.08,
1962
+ "learning_rate": 2.063500784929356e-05,
1963
+ "loss": 1.997,
1964
+ "step": 25100
1965
+ },
1966
+ {
1967
+ "epoch": 37.22,
1968
+ "learning_rate": 2.0399529042386185e-05,
1969
+ "loss": 1.9712,
1970
+ "step": 25200
1971
+ },
1972
+ {
1973
+ "epoch": 37.37,
1974
+ "learning_rate": 2.0164050235478804e-05,
1975
+ "loss": 2.0131,
1976
+ "step": 25300
1977
+ },
1978
+ {
1979
+ "epoch": 37.52,
1980
+ "learning_rate": 1.992857142857143e-05,
1981
+ "loss": 1.9605,
1982
+ "step": 25400
1983
+ },
1984
+ {
1985
+ "epoch": 37.67,
1986
+ "learning_rate": 1.9695447409733123e-05,
1987
+ "loss": 1.9966,
1988
+ "step": 25500
1989
+ },
1990
+ {
1991
+ "epoch": 37.67,
1992
+ "eval_loss": 0.940608024597168,
1993
+ "eval_runtime": 115.2635,
1994
+ "eval_samples_per_second": 17.534,
1995
+ "eval_steps_per_second": 2.195,
1996
+ "eval_wer": 2.0296882731321126,
1997
+ "step": 25500
1998
+ },
1999
+ {
2000
+ "epoch": 37.81,
2001
+ "learning_rate": 1.945996860282574e-05,
2002
+ "loss": 1.9879,
2003
+ "step": 25600
2004
+ },
2005
+ {
2006
+ "epoch": 37.96,
2007
+ "learning_rate": 1.9224489795918367e-05,
2008
+ "loss": 1.9836,
2009
+ "step": 25700
2010
+ },
2011
+ {
2012
+ "epoch": 38.11,
2013
+ "learning_rate": 1.8989010989010988e-05,
2014
+ "loss": 1.9872,
2015
+ "step": 25800
2016
+ },
2017
+ {
2018
+ "epoch": 38.26,
2019
+ "learning_rate": 1.8753532182103607e-05,
2020
+ "loss": 1.9684,
2021
+ "step": 25900
2022
+ },
2023
+ {
2024
+ "epoch": 38.4,
2025
+ "learning_rate": 1.851805337519623e-05,
2026
+ "loss": 1.9753,
2027
+ "step": 26000
2028
+ },
2029
+ {
2030
+ "epoch": 38.4,
2031
+ "eval_loss": 0.9418594837188721,
2032
+ "eval_runtime": 115.7124,
2033
+ "eval_samples_per_second": 17.466,
2034
+ "eval_steps_per_second": 2.186,
2035
+ "eval_wer": 2.0346363186541314,
2036
+ "step": 26000
2037
+ },
2038
+ {
2039
+ "epoch": 38.55,
2040
+ "learning_rate": 1.828257456828885e-05,
2041
+ "loss": 1.9926,
2042
+ "step": 26100
2043
+ },
2044
+ {
2045
+ "epoch": 38.7,
2046
+ "learning_rate": 1.8047095761381475e-05,
2047
+ "loss": 1.9685,
2048
+ "step": 26200
2049
+ },
2050
+ {
2051
+ "epoch": 38.85,
2052
+ "learning_rate": 1.7811616954474097e-05,
2053
+ "loss": 1.9707,
2054
+ "step": 26300
2055
+ },
2056
+ {
2057
+ "epoch": 39.0,
2058
+ "learning_rate": 1.7576138147566716e-05,
2059
+ "loss": 1.9477,
2060
+ "step": 26400
2061
+ },
2062
+ {
2063
+ "epoch": 39.14,
2064
+ "learning_rate": 1.7340659340659337e-05,
2065
+ "loss": 1.9524,
2066
+ "step": 26500
2067
+ },
2068
+ {
2069
+ "epoch": 39.14,
2070
+ "eval_loss": 0.927354097366333,
2071
+ "eval_runtime": 115.8614,
2072
+ "eval_samples_per_second": 17.443,
2073
+ "eval_steps_per_second": 2.184,
2074
+ "eval_wer": 2.0697674418604652,
2075
+ "step": 26500
2076
+ },
2077
+ {
2078
+ "epoch": 39.29,
2079
+ "learning_rate": 1.7105180533751963e-05,
2080
+ "loss": 1.9673,
2081
+ "step": 26600
2082
+ },
2083
+ {
2084
+ "epoch": 39.44,
2085
+ "learning_rate": 1.6869701726844584e-05,
2086
+ "loss": 1.9802,
2087
+ "step": 26700
2088
+ },
2089
+ {
2090
+ "epoch": 39.59,
2091
+ "learning_rate": 1.6634222919937203e-05,
2092
+ "loss": 1.9408,
2093
+ "step": 26800
2094
+ },
2095
+ {
2096
+ "epoch": 39.73,
2097
+ "learning_rate": 1.6398744113029824e-05,
2098
+ "loss": 1.9482,
2099
+ "step": 26900
2100
+ },
2101
+ {
2102
+ "epoch": 39.88,
2103
+ "learning_rate": 1.6163265306122446e-05,
2104
+ "loss": 1.9427,
2105
+ "step": 27000
2106
+ },
2107
+ {
2108
+ "epoch": 39.88,
2109
+ "eval_loss": 0.9232719540596008,
2110
+ "eval_runtime": 116.3191,
2111
+ "eval_samples_per_second": 17.375,
2112
+ "eval_steps_per_second": 2.175,
2113
+ "eval_wer": 2.078673923800099,
2114
+ "step": 27000
2115
+ },
2116
+ {
2117
+ "epoch": 40.03,
2118
+ "learning_rate": 1.592778649921507e-05,
2119
+ "loss": 1.9653,
2120
+ "step": 27100
2121
+ },
2122
+ {
2123
+ "epoch": 40.18,
2124
+ "learning_rate": 1.569230769230769e-05,
2125
+ "loss": 1.9157,
2126
+ "step": 27200
2127
+ },
2128
+ {
2129
+ "epoch": 40.32,
2130
+ "learning_rate": 1.545682888540031e-05,
2131
+ "loss": 1.9493,
2132
+ "step": 27300
2133
+ },
2134
+ {
2135
+ "epoch": 40.47,
2136
+ "learning_rate": 1.5221350078492935e-05,
2137
+ "loss": 1.8974,
2138
+ "step": 27400
2139
+ },
2140
+ {
2141
+ "epoch": 40.62,
2142
+ "learning_rate": 1.4985871271585557e-05,
2143
+ "loss": 1.9258,
2144
+ "step": 27500
2145
+ },
2146
+ {
2147
+ "epoch": 40.62,
2148
+ "eval_loss": 0.9182448983192444,
2149
+ "eval_runtime": 115.4065,
2150
+ "eval_samples_per_second": 17.512,
2151
+ "eval_steps_per_second": 2.192,
2152
+ "eval_wer": 2.052944087085601,
2153
+ "step": 27500
2154
+ },
2155
+ {
2156
+ "epoch": 40.77,
2157
+ "learning_rate": 1.4750392464678177e-05,
2158
+ "loss": 1.9354,
2159
+ "step": 27600
2160
+ },
2161
+ {
2162
+ "epoch": 40.92,
2163
+ "learning_rate": 1.4514913657770799e-05,
2164
+ "loss": 1.952,
2165
+ "step": 27700
2166
+ },
2167
+ {
2168
+ "epoch": 41.06,
2169
+ "learning_rate": 1.4281789638932496e-05,
2170
+ "loss": 1.9231,
2171
+ "step": 27800
2172
+ },
2173
+ {
2174
+ "epoch": 41.21,
2175
+ "learning_rate": 1.4046310832025116e-05,
2176
+ "loss": 1.9465,
2177
+ "step": 27900
2178
+ },
2179
+ {
2180
+ "epoch": 41.36,
2181
+ "learning_rate": 1.3810832025117738e-05,
2182
+ "loss": 1.9031,
2183
+ "step": 28000
2184
+ },
2185
+ {
2186
+ "epoch": 41.36,
2187
+ "eval_loss": 0.9149593114852905,
2188
+ "eval_runtime": 116.2555,
2189
+ "eval_samples_per_second": 17.384,
2190
+ "eval_steps_per_second": 2.176,
2191
+ "eval_wer": 2.078673923800099,
2192
+ "step": 28000
2193
+ },
2194
+ {
2195
+ "epoch": 41.51,
2196
+ "learning_rate": 1.357535321821036e-05,
2197
+ "loss": 1.9361,
2198
+ "step": 28100
2199
+ },
2200
+ {
2201
+ "epoch": 41.65,
2202
+ "learning_rate": 1.3342229199372054e-05,
2203
+ "loss": 1.916,
2204
+ "step": 28200
2205
+ },
2206
+ {
2207
+ "epoch": 41.8,
2208
+ "learning_rate": 1.3106750392464677e-05,
2209
+ "loss": 1.9149,
2210
+ "step": 28300
2211
+ },
2212
+ {
2213
+ "epoch": 41.95,
2214
+ "learning_rate": 1.2871271585557299e-05,
2215
+ "loss": 1.9037,
2216
+ "step": 28400
2217
+ },
2218
+ {
2219
+ "epoch": 42.1,
2220
+ "learning_rate": 1.263579277864992e-05,
2221
+ "loss": 1.9297,
2222
+ "step": 28500
2223
+ },
2224
+ {
2225
+ "epoch": 42.1,
2226
+ "eval_loss": 0.9040070176124573,
2227
+ "eval_runtime": 113.8901,
2228
+ "eval_samples_per_second": 17.745,
2229
+ "eval_steps_per_second": 2.221,
2230
+ "eval_wer": 2.0504700643245917,
2231
+ "step": 28500
2232
+ },
2233
+ {
2234
+ "epoch": 42.25,
2235
+ "learning_rate": 1.2400313971742541e-05,
2236
+ "loss": 1.8855,
2237
+ "step": 28600
2238
+ },
2239
+ {
2240
+ "epoch": 42.39,
2241
+ "learning_rate": 1.2164835164835163e-05,
2242
+ "loss": 1.9095,
2243
+ "step": 28700
2244
+ },
2245
+ {
2246
+ "epoch": 42.54,
2247
+ "learning_rate": 1.1929356357927786e-05,
2248
+ "loss": 1.8913,
2249
+ "step": 28800
2250
+ },
2251
+ {
2252
+ "epoch": 42.69,
2253
+ "learning_rate": 1.1693877551020408e-05,
2254
+ "loss": 1.8685,
2255
+ "step": 28900
2256
+ },
2257
+ {
2258
+ "epoch": 42.84,
2259
+ "learning_rate": 1.1458398744113028e-05,
2260
+ "loss": 1.9041,
2261
+ "step": 29000
2262
+ },
2263
+ {
2264
+ "epoch": 42.84,
2265
+ "eval_loss": 0.9008907675743103,
2266
+ "eval_runtime": 114.9643,
2267
+ "eval_samples_per_second": 17.579,
2268
+ "eval_steps_per_second": 2.201,
2269
+ "eval_wer": 2.05789213260762,
2270
+ "step": 29000
2271
+ },
2272
+ {
2273
+ "epoch": 42.98,
2274
+ "learning_rate": 1.122291993720565e-05,
2275
+ "loss": 1.8963,
2276
+ "step": 29100
2277
+ },
2278
+ {
2279
+ "epoch": 43.13,
2280
+ "learning_rate": 1.0987441130298273e-05,
2281
+ "loss": 1.9068,
2282
+ "step": 29200
2283
+ },
2284
+ {
2285
+ "epoch": 43.28,
2286
+ "learning_rate": 1.0751962323390895e-05,
2287
+ "loss": 1.9003,
2288
+ "step": 29300
2289
+ },
2290
+ {
2291
+ "epoch": 43.43,
2292
+ "learning_rate": 1.0516483516483515e-05,
2293
+ "loss": 1.891,
2294
+ "step": 29400
2295
+ },
2296
+ {
2297
+ "epoch": 43.57,
2298
+ "learning_rate": 1.0281004709576137e-05,
2299
+ "loss": 1.8929,
2300
+ "step": 29500
2301
+ },
2302
+ {
2303
+ "epoch": 43.57,
2304
+ "eval_loss": 0.8968304991722107,
2305
+ "eval_runtime": 116.4378,
2306
+ "eval_samples_per_second": 17.357,
2307
+ "eval_steps_per_second": 2.173,
2308
+ "eval_wer": 2.032657100445324,
2309
+ "step": 29500
2310
+ },
2311
+ {
2312
+ "epoch": 43.72,
2313
+ "learning_rate": 1.0045525902668759e-05,
2314
+ "loss": 1.8827,
2315
+ "step": 29600
2316
+ },
2317
+ {
2318
+ "epoch": 43.87,
2319
+ "learning_rate": 9.810047095761382e-06,
2320
+ "loss": 1.8862,
2321
+ "step": 29700
2322
+ },
2323
+ {
2324
+ "epoch": 44.02,
2325
+ "learning_rate": 9.574568288854002e-06,
2326
+ "loss": 1.8787,
2327
+ "step": 29800
2328
+ },
2329
+ {
2330
+ "epoch": 44.17,
2331
+ "learning_rate": 9.339089481946624e-06,
2332
+ "loss": 1.8501,
2333
+ "step": 29900
2334
+ },
2335
+ {
2336
+ "epoch": 44.31,
2337
+ "learning_rate": 9.103610675039246e-06,
2338
+ "loss": 1.9077,
2339
+ "step": 30000
2340
+ },
2341
+ {
2342
+ "epoch": 44.31,
2343
+ "eval_loss": 0.8953686952590942,
2344
+ "eval_runtime": 115.4838,
2345
+ "eval_samples_per_second": 17.5,
2346
+ "eval_steps_per_second": 2.191,
2347
+ "eval_wer": 2.061850569025235,
2348
+ "step": 30000
2349
+ },
2350
+ {
2351
+ "epoch": 44.46,
2352
+ "learning_rate": 8.868131868131868e-06,
2353
+ "loss": 1.8804,
2354
+ "step": 30100
2355
+ },
2356
+ {
2357
+ "epoch": 44.61,
2358
+ "learning_rate": 8.63265306122449e-06,
2359
+ "loss": 1.8723,
2360
+ "step": 30200
2361
+ },
2362
+ {
2363
+ "epoch": 44.76,
2364
+ "learning_rate": 8.397174254317111e-06,
2365
+ "loss": 1.8577,
2366
+ "step": 30300
2367
+ },
2368
+ {
2369
+ "epoch": 44.9,
2370
+ "learning_rate": 8.161695447409733e-06,
2371
+ "loss": 1.8811,
2372
+ "step": 30400
2373
+ },
2374
+ {
2375
+ "epoch": 45.05,
2376
+ "learning_rate": 7.928571428571429e-06,
2377
+ "loss": 1.8504,
2378
+ "step": 30500
2379
+ },
2380
+ {
2381
+ "epoch": 45.05,
2382
+ "eval_loss": 0.892192542552948,
2383
+ "eval_runtime": 116.2513,
2384
+ "eval_samples_per_second": 17.385,
2385
+ "eval_steps_per_second": 2.176,
2386
+ "eval_wer": 2.07372587827808,
2387
+ "step": 30500
2388
+ },
2389
+ {
2390
+ "epoch": 45.2,
2391
+ "learning_rate": 7.693092621664049e-06,
2392
+ "loss": 1.861,
2393
+ "step": 30600
2394
+ },
2395
+ {
2396
+ "epoch": 45.35,
2397
+ "learning_rate": 7.457613814756671e-06,
2398
+ "loss": 1.8496,
2399
+ "step": 30700
2400
+ },
2401
+ {
2402
+ "epoch": 45.49,
2403
+ "learning_rate": 7.222135007849293e-06,
2404
+ "loss": 1.8612,
2405
+ "step": 30800
2406
+ },
2407
+ {
2408
+ "epoch": 45.64,
2409
+ "learning_rate": 6.986656200941915e-06,
2410
+ "loss": 1.865,
2411
+ "step": 30900
2412
+ },
2413
+ {
2414
+ "epoch": 45.79,
2415
+ "learning_rate": 6.751177394034536e-06,
2416
+ "loss": 1.8732,
2417
+ "step": 31000
2418
+ },
2419
+ {
2420
+ "epoch": 45.79,
2421
+ "eval_loss": 0.8897548317909241,
2422
+ "eval_runtime": 116.5927,
2423
+ "eval_samples_per_second": 17.334,
2424
+ "eval_steps_per_second": 2.17,
2425
+ "eval_wer": 2.0682830282038593,
2426
+ "step": 31000
2427
+ },
2428
+ {
2429
+ "epoch": 45.94,
2430
+ "learning_rate": 6.5156985871271585e-06,
2431
+ "loss": 1.8374,
2432
+ "step": 31100
2433
+ },
2434
+ {
2435
+ "epoch": 46.09,
2436
+ "learning_rate": 6.280219780219779e-06,
2437
+ "loss": 1.8395,
2438
+ "step": 31200
2439
+ },
2440
+ {
2441
+ "epoch": 46.23,
2442
+ "learning_rate": 6.044740973312402e-06,
2443
+ "loss": 1.8377,
2444
+ "step": 31300
2445
+ },
2446
+ {
2447
+ "epoch": 46.38,
2448
+ "learning_rate": 5.809262166405023e-06,
2449
+ "loss": 1.87,
2450
+ "step": 31400
2451
+ },
2452
+ {
2453
+ "epoch": 46.53,
2454
+ "learning_rate": 5.573783359497644e-06,
2455
+ "loss": 1.877,
2456
+ "step": 31500
2457
+ },
2458
+ {
2459
+ "epoch": 46.53,
2460
+ "eval_loss": 0.8848925828933716,
2461
+ "eval_runtime": 116.1465,
2462
+ "eval_samples_per_second": 17.4,
2463
+ "eval_steps_per_second": 2.178,
2464
+ "eval_wer": 2.0588817417120238,
2465
+ "step": 31500
2466
+ },
2467
+ {
2468
+ "epoch": 46.68,
2469
+ "learning_rate": 5.3383045525902665e-06,
2470
+ "loss": 1.8256,
2471
+ "step": 31600
2472
+ },
2473
+ {
2474
+ "epoch": 46.82,
2475
+ "learning_rate": 5.1028257456828875e-06,
2476
+ "loss": 1.8317,
2477
+ "step": 31700
2478
+ },
2479
+ {
2480
+ "epoch": 46.97,
2481
+ "learning_rate": 4.86734693877551e-06,
2482
+ "loss": 1.8579,
2483
+ "step": 31800
2484
+ },
2485
+ {
2486
+ "epoch": 47.12,
2487
+ "learning_rate": 4.631868131868132e-06,
2488
+ "loss": 1.839,
2489
+ "step": 31900
2490
+ },
2491
+ {
2492
+ "epoch": 47.27,
2493
+ "learning_rate": 4.396389324960754e-06,
2494
+ "loss": 1.8587,
2495
+ "step": 32000
2496
+ },
2497
+ {
2498
+ "epoch": 47.27,
2499
+ "eval_loss": 0.8843359351158142,
2500
+ "eval_runtime": 116.5866,
2501
+ "eval_samples_per_second": 17.335,
2502
+ "eval_steps_per_second": 2.17,
2503
+ "eval_wer": 2.045027214250371,
2504
+ "step": 32000
2505
+ },
2506
+ {
2507
+ "epoch": 47.41,
2508
+ "learning_rate": 4.160910518053375e-06,
2509
+ "loss": 1.8419,
2510
+ "step": 32100
2511
+ },
2512
+ {
2513
+ "epoch": 47.56,
2514
+ "learning_rate": 3.925431711145996e-06,
2515
+ "loss": 1.8639,
2516
+ "step": 32200
2517
+ },
2518
+ {
2519
+ "epoch": 47.71,
2520
+ "learning_rate": 3.6899529042386186e-06,
2521
+ "loss": 1.8395,
2522
+ "step": 32300
2523
+ },
2524
+ {
2525
+ "epoch": 47.86,
2526
+ "learning_rate": 3.45447409733124e-06,
2527
+ "loss": 1.8369,
2528
+ "step": 32400
2529
+ },
2530
+ {
2531
+ "epoch": 48.01,
2532
+ "learning_rate": 3.2189952904238617e-06,
2533
+ "loss": 1.8236,
2534
+ "step": 32500
2535
+ },
2536
+ {
2537
+ "epoch": 48.01,
2538
+ "eval_loss": 0.8810222148895264,
2539
+ "eval_runtime": 115.817,
2540
+ "eval_samples_per_second": 17.45,
2541
+ "eval_steps_per_second": 2.184,
2542
+ "eval_wer": 2.0554181098466104,
2543
+ "step": 32500
2544
+ },
2545
+ {
2546
+ "epoch": 48.15,
2547
+ "learning_rate": 2.9835164835164835e-06,
2548
+ "loss": 1.8468,
2549
+ "step": 32600
2550
+ },
2551
+ {
2552
+ "epoch": 48.3,
2553
+ "learning_rate": 2.7503924646781788e-06,
2554
+ "loss": 1.8326,
2555
+ "step": 32700
2556
+ },
2557
+ {
2558
+ "epoch": 48.45,
2559
+ "learning_rate": 2.5149136577708006e-06,
2560
+ "loss": 1.8279,
2561
+ "step": 32800
2562
+ },
2563
+ {
2564
+ "epoch": 48.6,
2565
+ "learning_rate": 2.2794348508634223e-06,
2566
+ "loss": 1.8324,
2567
+ "step": 32900
2568
+ },
2569
+ {
2570
+ "epoch": 48.74,
2571
+ "learning_rate": 2.043956043956044e-06,
2572
+ "loss": 1.8392,
2573
+ "step": 33000
2574
+ },
2575
+ {
2576
+ "epoch": 48.74,
2577
+ "eval_loss": 0.8820456266403198,
2578
+ "eval_runtime": 115.6891,
2579
+ "eval_samples_per_second": 17.469,
2580
+ "eval_steps_per_second": 2.187,
2581
+ "eval_wer": 2.0573973280554183,
2582
+ "step": 33000
2583
+ },
2584
+ {
2585
+ "epoch": 48.89,
2586
+ "learning_rate": 1.8084772370486653e-06,
2587
+ "loss": 1.8363,
2588
+ "step": 33100
2589
+ },
2590
+ {
2591
+ "epoch": 49.04,
2592
+ "learning_rate": 1.572998430141287e-06,
2593
+ "loss": 1.7996,
2594
+ "step": 33200
2595
+ },
2596
+ {
2597
+ "epoch": 49.19,
2598
+ "learning_rate": 1.3375196232339088e-06,
2599
+ "loss": 1.8113,
2600
+ "step": 33300
2601
+ },
2602
+ {
2603
+ "epoch": 49.34,
2604
+ "learning_rate": 1.1020408163265304e-06,
2605
+ "loss": 1.8428,
2606
+ "step": 33400
2607
+ },
+ {
+ "epoch": 49.48,
+ "learning_rate": 8.665620094191522e-07,
+ "loss": 1.8428,
+ "step": 33500
+ },
+ {
+ "epoch": 49.48,
+ "eval_loss": 0.8815611600875854,
+ "eval_runtime": 117.0058,
+ "eval_samples_per_second": 17.273,
+ "eval_steps_per_second": 2.162,
+ "eval_wer": 2.066798614547254,
+ "step": 33500
+ },
+ {
+ "epoch": 49.63,
+ "learning_rate": 6.310832025117738e-07,
+ "loss": 1.8284,
+ "step": 33600
+ },
+ {
+ "epoch": 49.78,
+ "learning_rate": 3.9560439560439557e-07,
+ "loss": 1.8226,
+ "step": 33700
+ },
+ {
+ "epoch": 49.93,
+ "learning_rate": 1.6012558869701725e-07,
+ "loss": 1.8287,
+ "step": 33800
+ },
+ {
+ "epoch": 50.0,
+ "step": 33850,
+ "total_flos": 1.54029172542989e+20,
+ "train_loss": 4.34445102640938,
+ "train_runtime": 69888.62,
+ "train_samples_per_second": 15.505,
+ "train_steps_per_second": 0.484
+ }
+ ],
+ "max_steps": 33850,
+ "num_train_epochs": 50,
+ "total_flos": 1.54029172542989e+20,
+ "trial_name": null,
+ "trial_params": null
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:865e00e6503ba9a84aae151f6fca7048aa8118080935253305e52fcf3ebbf980
+ size 3055
vocab.json ADDED
@@ -0,0 +1 @@
+ {"?": 1, "c": 2, "·": 3, "×": 4, "̃": 5, "̌": 6, "ε": 7, "λ": 8, "μ": 9, "и": 10, "т": 11, "—": 12, "‘": 13, "’": 14, "“": 15, "”": 16, "…": 17, "‧": 18, "─": 19, "□": 20, "、": 21, "。": 22, "〈": 23, "〉": 24, "《": 25, "》": 26, "「": 27, "」": 28, "『": 29, "』": 30, "ア": 31, "オ": 32, "カ": 33, "チ": 34, "ド": 35, "ベ": 36, "ャ": 37, "ヤ": 38, "ン": 39, "・": 40, "ー": 41, "ㄟ": 42, "䲟": 43, "䴓": 44, "䴕": 45, "一": 46, "丁": 47, "七": 48, "万": 49, "丈": 50, "三": 51, "上": 52, "下": 53, "不": 54, "与": 55, "丐": 56, "丑": 57, "专": 58, "且": 59, "丕": 60, "世": 61, "丘": 62, "丙": 63, "业": 64, "丛": 65, "东": 66, "丝": 67, "丞": 68, "丟": 69, "丢": 70, "两": 71, "严": 72, "丧": 73, "个": 74, "丫": 75, "中": 76, "丰": 77, "串": 78, "临": 79, "丸": 80, "丹": 81, "为": 82, "主": 83, "丽": 84, "举": 85, "乂": 86, "乃": 87, "久": 88, "么": 89, "义": 90, "之": 91, "乌": 92, "乍": 93, "乎": 94, "乏": 95, "乐": 96, "乒": 97, "乓": 98, "乔": 99, "乖": 100, "乘": 101, "乙": 102, "九": 103, "乞": 104, "也": 105, "习": 106, "乡": 107, "书": 108, "买": 109, "乱": 110, "乳": 111, "乾": 112, "了": 113, "予": 114, "争": 115, "事": 116, "二": 117, "于": 118, "亏": 119, "云": 120, "互": 121, "五": 122, "井": 123, "亚": 124, "些": 125, "亡": 126, "交": 127, "亥": 128, "亦": 129, "产": 130, "亨": 131, "亩": 132, "享": 133, "京": 134, "亭": 135, "亮": 136, "亲": 137, "亳": 138, "人": 139, "亿": 140, "什": 141, "仁": 142, "仄": 143, "仅": 144, "仆": 145, "仇": 146, "今": 147, "介": 148, "仍": 149, "从": 150, "仑": 151, "仓": 152, "仔": 153, "仕": 154, "他": 155, "仗": 156, "付": 157, "仙": 158, "仞": 159, "代": 160, "令": 161, "以": 162, "仪": 163, "们": 164, "仰": 165, "仲": 166, "件": 167, "价": 168, "任": 169, "份": 170, "仿": 171, "企": 172, "伉": 173, "伊": 174, "伍": 175, "伎": 176, "伏": 177, "伐": 178, "休": 179, "众": 180, "优": 181, "伙": 182, "会": 183, "伛": 184, "伞": 185, "伟": 186, "传": 187, "伤": 188, "伦": 189, "伪": 190, "伫": 191, "伯": 192, "估": 193, "伴": 194, "伸": 195, "伺": 196, "似": 197, "伽": 198, "佃": 199, "但": 200, "位": 201, "低": 202, "住": 203, "佐": 204, "佑": 205, "体": 206, "何": 207, "佗": 208, "佘": 209, "余": 210, "佚": 211, "佛": 212, "作": 213, "佟": 214, "你": 215, "佣": 216, "佤": 217, "佥": 218, "佩": 219, "佬": 220, "佰": 221, "佳": 222, "使": 223, "侂": 224, "侃": 225, "侄": 226, "侈": 227, "例": 228, "侍": 229, "侏": 230, "侗": 231, "供": 232, "依": 233, "侠": 234, "侣": 235, "侦": 236, "侧": 237, "侨": 238, "侪": 239, "侬": 240, "侮": 241, "侯": 242, "侴": 243, "侵": 244, "便": 245, "促": 246, "俄": 247, "俊": 248, "俐": 249, "俗": 250, "俘": 251, "保": 252, "俞": 253, "信": 254, "俣": 255, "俨": 256, "俩": 257, "俭": 258, "修": 259, "俯": 260, "俱": 261, "俳": 262, "俵": 263, "俸": 264, "倍": 265, "倒": 266, "倘": 267, "候": 268, "倚": 269, "借": 270, "倡": 271, "倦": 272, "倪": 273, "倭": 274, "债": 275, "倻": 276, "值": 277, "倾": 278, "偃": 279, "假": 280, "偈": 281, "偏": 282, "偕": 283, "做": 284, "停": 285, "健": 286, "偰": 287, "偲": 288, "偶": 289, "偷": 290, "偿": 291, "傅": 292, "傍": 293, "傕": 294, "傣": 295, "储": 296, "催": 297, "傲": 298, "傻": 299, "像": 300, "僖": 301, "僚": 302, "僧": 303, "僵": 304, "儆": 305, "儋": 306, "儒": 307, "儿": 308, "兀": 309, "允": 310, "元": 311, "兄": 312, "充": 313, "兆": 314, "先": 315, "光": 316, "克": 317, "免": 318, "兑": 319, "兔": 320, "兖": 321, "党": 322, "兜": 323, "入": 324, "全": 325, "八": 326, "公": 327, "六": 328, "兮": 329, "兰": 330, "共": 331, "关": 332, "兴": 333, "兵": 334, "其": 335, "具": 336, "典": 337, "兹": 338, "养": 339, "兼": 340, "兽": 341, "冀": 342, "内": 343, "冈": 344, "冉": 345, "册": 346, "再": 347, "冒": 348, "冕": 349, "冗": 350, "写": 351, "军": 352, "农": 353, "冠": 354, "冢": 355, "冤": 356, "冥": 357, "冬": 358, "冯": 359, "冰": 360, "冱": 361, "冲": 362, "决": 363, "况": 364, "冶": 365, "冷": 366, 
"冻": 367, "冼": 368, "净": 369, "凃": 370, "凄": 371, "准": 372, "凉": 373, "凋": 374, "凌": 375, "减": 376, "凑": 377, "凝": 378, "几": 379, "凡": 380, "凤": 381, "凭": 382, "凯": 383, "凰": 384, "凳": 385, "凶": 386, "凸": 387, "凹": 388, "出": 389, "击": 390, "函": 391, "凿": 392, "刀": 393, "刃": 394, "分": 395, "切": 396, "刈": 397, "刊": 398, "刍": 399, "刑": 400, "划": 401, "列": 402, "刘": 403, "则": 404, "刚": 405, "创": 406, "初": 407, "删": 408, "判": 409, "別": 410, "利": 411, "别": 412, "刮": 413, "到": 414, "制": 415, "刷": 416, "券": 417, "刹": 418, "刺": 419, "刻": 420, "剀": 421, "剂": 422, "削": 423, "剌": 424, "前": 425, "剑": 426, "剖": 427, "剥": 428, "剧": 429, "剩": 430, "剪": 431, "副": 432, "割": 433, "剿": 434, "劈": 435, "力": 436, "劝": 437, "办": 438, "功": 439, "加": 440, "务": 441, "劣": 442, "动": 443, "助": 444, "努": 445, "劫": 446, "劭": 447, "励": 448, "劲": 449, "劳": 450, "劾": 451, "势": 452, "勃": 453, "勇": 454, "勉": 455, "勋": 456, "勍": 457, "勐": 458, "勒": 459, "勖": 460, "勘": 461, "募": 462, "勤": 463, "勺": 464, "勾": 465, "勿": 466, "匀": 467, "包": 468, "匆": 469, "匈": 470, "匍": 471, "匐": 472, "匕": 473, "化": 474, "北": 475, "匙": 476, "匝": 477, "匠": 478, "匡": 479, "匣": 480, "匪": 481, "匮": 482, "匹": 483, "区": 484, "医": 485, "匾": 486, "匿": 487, "十": 488, "千": 489, "升": 490, "午": 491, "卉": 492, "半": 493, "华": 494, "协": 495, "卑": 496, "卒": 497, "卓": 498, "单": 499, "卖": 500, "南": 501, "博": 502, "卜": 503, "卞": 504, "占": 505, "卡": 506, "卢": 507, "卤": 508, "卦": 509, "卧": 510, "卫": 511, "卯": 512, "印": 513, "危": 514, "即": 515, "却": 516, "卵": 517, "卷": 518, "卸": 519, "卿": 520, "厂": 521, "厄": 522, "厅": 523, "历": 524, "厉": 525, "压": 526, "厌": 527, "厍": 528, "厕": 529, "厘": 530, "厚": 531, "厝": 532, "原": 533, "厢": 534, "厥": 535, "厦": 536, "厨": 537, "厩": 538, "厮": 539, "去": 540, "县": 541, "参": 542, "叅": 543, "又": 544, "叉": 545, "及": 546, "友": 547, "双": 548, "反": 549, "发": 550, "叔": 551, "取": 552, "受": 553, "变": 554, "叙": 555, "叛": 556, "叟": 557, "叠": 558, "口": 559, "古": 560, "句": 561, "另": 562, "叩": 563, "只": 564, "叫": 565, "召": 566, "叭": 567, "可": 568, "台": 569, "史": 570, "右": 571, "叶": 572, "号": 573, "司": 574, "叹": 575, "叻": 576, "叼": 577, "吁": 578, "吃": 579, "各": 580, "合": 581, "吉": 582, "吊": 583, "同": 584, "名": 585, "后": 586, "吏": 587, "吐": 588, "向": 589, "吓": 590, "吕": 591, "吖": 592, "吗": 593, "君": 594, "吞": 595, "吟": 596, "吠": 597, "否": 598, "吧": 599, "吨": 600, "含": 601, "听": 602, "启": 603, "吴": 604, "吵": 605, "吸": 606, "吹": 607, "吻": 608, "吼": 609, "吾": 610, "呀": 611, "呆": 612, "呈": 613, "告": 614, "呋": 615, "呐": 616, "呔": 617, "呗": 618, "员": 619, "呢": 620, "呤": 621, "周": 622, "味": 623, "呵": 624, "呻": 625, "呼": 626, "命": 627, "咀": 628, "和": 629, "咎": 630, "咏": 631, "咒": 632, "咕": 633, "咖": 634, "咝": 635, "咨": 636, "咪": 637, "咬": 638, "咳": 639, "咸": 640, "咽": 641, "哀": 642, "品": 643, "哄": 644, "哇": 645, "哈": 646, "哉": 647, "响": 648, "哑": 649, "哔": 650, "哥": 651, "哨": 652, "哩": 653, "哪": 654, "哭": 655, "哮": 656, "哲": 657, "哺": 658, "哼": 659, "唁": 660, "唆": 661, "唇": 662, "唐": 663, "唑": 664, "唔": 665, "唤": 666, "唬": 667, "售": 668, "唯": 669, "唱": 670, "唸": 671, "唾": 672, "啄": 673, "商": 674, "啉": 675, "啊": 676, "啡": 677, "啤": 678, "啥": 679, "啦": 680, "啧": 681, "啰": 682, "啶": 683, "啸": 684, "喀": 685, "喃": 686, "善": 687, "喇": 688, "喉": 689, "喊": 690, "喔": 691, "喘": 692, "喙": 693, "喜": 694, "喝": 695, "喧": 696, "喱": 697, "喵": 698, "喷": 699, "喹": 700, "喻": 701, "喾": 702, "嗅": 703, "嗓": 704, "嗜": 705, "嗣": 706, "嗽": 707, "嘈": 708, "嘉": 709, "嘌": 710, "嘎": 711, "嘘": 712, "嘛": 713, "嘟": 714, "嘧": 715, "嘲": 716, "嘴": 717, "嘻": 718, "噌": 719, "噜": 720, "器": 721, "噩": 
722, "噪": 723, "噬": 724, "噶": 725, "嚏": 726, "嚓": 727, "嚣": 728, "囊": 729, "囚": 730, "四": 731, "回": 732, "因": 733, "团": 734, "园": 735, "困": 736, "围": 737, "固": 738, "国": 739, "图": 740, "圃": 741, "圆": 742, "圈": 743, "圉": 744, "圜": 745, "土": 746, "圣": 747, "在": 748, "圩": 749, "圪": 750, "圭": 751, "地": 752, "圳": 753, "圹": 754, "场": 755, "圻": 756, "圾": 757, "址": 758, "坂": 759, "均": 760, "坊": 761, "坍": 762, "坎": 763, "坏": 764, "坐": 765, "坑": 766, "块": 767, "坚": 768, "坛": 769, "坜": 770, "坝": 771, "坞": 772, "坟": 773, "坠": 774, "坡": 775, "坤": 776, "坦": 777, "坨": 778, "坪": 779, "坳": 780, "坻": 781, "垂": 782, "垃": 783, "垄": 784, "型": 785, "垌": 786, "垒": 787, "垢": 788, "垣": 789, "垦": 790, "垩": 791, "垫": 792, "埃": 793, "埇": 794, "埈": 795, "埋": 796, "城": 797, "埔": 798, "埕": 799, "埗": 800, "域": 801, "埠": 802, "培": 803, "基": 804, "堂": 805, "堆": 806, "堇": 807, "堍": 808, "堎": 809, "堕": 810, "堡": 811, "堤": 812, "堪": 813, "堰": 814, "堵": 815, "堺": 816, "塌": 817, "塑": 818, "塔": 819, "塘": 820, "塞": 821, "填": 822, "塬": 823, "塾": 824, "墀": 825, "境": 826, "墅": 827, "墓": 828, "墙": 829, "增": 830, "墟": 831, "墨": 832, "墩": 833, "壁": 834, "壕": 835, "壤": 836, "士": 837, "壬": 838, "壮": 839, "声": 840, "壳": 841, "壶": 842, "壸": 843, "壹": 844, "壽": 845, "处": 846, "备": 847, "复": 848, "夏": 849, "夕": 850, "外": 851, "夙": 852, "多": 853, "夜": 854, "够": 855, "夥": 856, "大": 857, "天": 858, "太": 859, "夫": 860, "夭": 861, "央": 862, "失": 863, "头": 864, "夷": 865, "夸": 866, "夹": 867, "夺": 868, "奂": 869, "奄": 870, "奇": 871, "奈": 872, "奉": 873, "奋": 874, "奎": 875, "奏": 876, "契": 877, "奔": 878, "奕": 879, "奖": 880, "套": 881, "奘": 882, "奚": 883, "奠": 884, "奢": 885, "奣": 886, "奥": 887, "女": 888, "奴": 889, "奶": 890, "奸": 891, "她": 892, "好": 893, "如": 894, "妃": 895, "妄": 896, "妆": 897, "妇": 898, "妈": 899, "妊": 900, "妍": 901, "妒": 902, "妓": 903, "妖": 904, "妙": 905, "妡": 906, "妤": 907, "妥": 908, "妨": 909, "妫": 910, "妮": 911, "妲": 912, "妳": 913, "妹": 914, "妻": 915, "妾": 916, "姆": 917, "姊": 918, "始": 919, "姐": 920, "姑": 921, "姒": 922, "姓": 923, "委": 924, "姚": 925, "姜": 926, "姝": 927, "姨": 928, "姬": 929, "姮": 930, "姻": 931, "姿": 932, "威": 933, "娃": 934, "娄": 935, "娅": 936, "娆": 937, "娇": 938, "娘": 939, "娜": 940, "娟": 941, "娠": 942, "娣": 943, "娥": 944, "娱": 945, "娴": 946, "娶": 947, "娼": 948, "婆": 949, "婉": 950, "婕": 951, "婚": 952, "婢": 953, "婪": 954, "婴": 955, "婵": 956, "婷": 957, "婺": 958, "婿": 959, "媒": 960, "媛": 961, "媲": 962, "媳": 963, "嫁": 964, "嫉": 965, "嫌": 966, "嫔": 967, "嫖": 968, "嫡": 969, "嫣": 970, "嫩": 971, "嬤": 972, "嬴": 973, "嬷": 974, "孀": 975, "子": 976, "孔": 977, "孕": 978, "孖": 979, "字": 980, "存": 981, "孙": 982, "孚": 983, "孛": 984, "孜": 985, "孝": 986, "孟": 987, "孢": 988, "季": 989, "孤": 990, "学": 991, "孩": 992, "孪": 993, "孵": 994, "孺": 995, "宁": 996, "它": 997, "宅": 998, "宇": 999, "守": 1000, "安": 1001, "宋": 1002, "完": 1003, "宏": 1004, "宕": 1005, "宗": 1006, "官": 1007, "宙": 1008, "定": 1009, "宛": 1010, "宜": 1011, "宝": 1012, "实": 1013, "宠": 1014, "审": 1015, "客": 1016, "宣": 1017, "室": 1018, "宥": 1019, "宦": 1020, "宪": 1021, "宫": 1022, "宰": 1023, "害": 1024, "宴": 1025, "家": 1026, "宸": 1027, "容": 1028, "宽": 1029, "宾": 1030, "宿": 1031, "寀": 1032, "寂": 1033, "寄": 1034, "寅": 1035, "密": 1036, "寇": 1037, "富": 1038, "寒": 1039, "寓": 1040, "寝": 1041, "寞": 1042, "察": 1043, "寡": 1044, "寨": 1045, "寮": 1046, "寰": 1047, "寸": 1048, "对": 1049, "寺": 1050, "寻": 1051, "导": 1052, "寿": 1053, "封": 1054, "専": 1055, "射": 1056, "将": 1057, "尉": 1058, "尊": 1059, "小": 1060, "少": 1061, "尔": 1062, "尕": 1063, "尖": 1064, "尘": 1065, "尚": 1066, "尝": 1067, "尤": 1068, "尧": 1069, "尨": 
1070, "就": 1071, "尸": 1072, "尹": 1073, "尺": 1074, "尻": 1075, "尼": 1076, "尽": 1077, "尾": 1078, "尿": 1079, "局": 1080, "层": 1081, "居": 1082, "屈": 1083, "届": 1084, "屋": 1085, "屎": 1086, "屏": 1087, "屐": 1088, "屑": 1089, "展": 1090, "属": 1091, "屠": 1092, "屡": 1093, "履": 1094, "屯": 1095, "山": 1096, "屹": 1097, "屿": 1098, "岁": 1099, "岂": 1100, "岈": 1101, "岐": 1102, "岑": 1103, "岔": 1104, "岗": 1105, "岚": 1106, "岛": 1107, "岩": 1108, "岫": 1109, "岬": 1110, "岭": 1111, "岱": 1112, "岳": 1113, "岷": 1114, "岸": 1115, "峄": 1116, "峒": 1117, "峙": 1118, "峡": 1119, "峤": 1120, "峥": 1121, "峨": 1122, "峩": 1123, "峪": 1124, "峭": 1125, "峯": 1126, "峰": 1127, "峻": 1128, "崁": 1129, "崂": 1130, "崇": 1131, "崎": 1132, "崔": 1133, "崖": 1134, "崛": 1135, "崞": 1136, "崧": 1137, "崩": 1138, "崭": 1139, "崴": 1140, "嵋": 1141, "嵌": 1142, "嵖": 1143, "嵗": 1144, "嵩": 1145, "嵯": 1146, "嵴": 1147, "嶷": 1148, "巅": 1149, "川": 1150, "州": 1151, "巡": 1152, "巢": 1153, "工": 1154, "左": 1155, "巧": 1156, "巨": 1157, "巩": 1158, "巫": 1159, "差": 1160, "巯": 1161, "己": 1162, "已": 1163, "巳": 1164, "巴": 1165, "巷": 1166, "巽": 1167, "巾": 1168, "币": 1169, "市": 1170, "布": 1171, "帅": 1172, "帆": 1173, "师": 1174, "希": 1175, "帐": 1176, "帕": 1177, "帖": 1178, "帘": 1179, "帚": 1180, "帛": 1181, "帜": 1182, "帝": 1183, "带": 1184, "帧": 1185, "席": 1186, "帮": 1187, "帷": 1188, "常": 1189, "帽": 1190, "幂": 1191, "幅": 1192, "幌": 1193, "幔": 1194, "幕": 1195, "幡": 1196, "幢": 1197, "干": 1198, "平": 1199, "年": 1200, "并": 1201, "幸": 1202, "幻": 1203, "幼": 1204, "幽": 1205, "广": 1206, "庄": 1207, "庆": 1208, "庇": 1209, "床": 1210, "序": 1211, "庐": 1212, "库": 1213, "应": 1214, "底": 1215, "店": 1216, "庙": 1217, "庚": 1218, "府": 1219, "庞": 1220, "废": 1221, "度": 1222, "座": 1223, "庭": 1224, "庵": 1225, "庶": 1226, "康": 1227, "庸": 1228, "庹": 1229, "庾": 1230, "廆": 1231, "廉": 1232, "廊": 1233, "廓": 1234, "廖": 1235, "廪": 1236, "延": 1237, "廷": 1238, "建": 1239, "廿": 1240, "开": 1241, "异": 1242, "弃": 1243, "弄": 1244, "弈": 1245, "弊": 1246, "式": 1247, "弓": 1248, "引": 1249, "弗": 1250, "弘": 1251, "弟": 1252, "张": 1253, "弢": 1254, "弥": 1255, "弦": 1256, "弧": 1257, "弯": 1258, "弱": 1259, "弹": 1260, "强": 1261, "弼": 1262, "归": 1263, "当": 1264, "录": 1265, "彗": 1266, "彝": 1267, "形": 1268, "彤": 1269, "彦": 1270, "彧": 1271, "彩": 1272, "彪": 1273, "彬": 1274, "彭": 1275, "彰": 1276, "影": 1277, "役": 1278, "彻": 1279, "彼": 1280, "往": 1281, "征": 1282, "径": 1283, "待": 1284, "徇": 1285, "很": 1286, "徊": 1287, "律": 1288, "徐": 1289, "徒": 1290, "得": 1291, "徘": 1292, "徙": 1293, "御": 1294, "循": 1295, "微": 1296, "德": 1297, "徽": 1298, "心": 1299, "必": 1300, "忆": 1301, "忌": 1302, "忍": 1303, "忒": 1304, "志": 1305, "忘": 1306, "忙": 1307, "忠": 1308, "忤": 1309, "忧": 1310, "快": 1311, "忱": 1312, "念": 1313, "忻": 1314, "忽": 1315, "怀": 1316, "态": 1317, "怎": 1318, "怒": 1319, "怕": 1320, "怖": 1321, "怜": 1322, "思": 1323, "怡": 1324, "急": 1325, "性": 1326, "怨": 1327, "怪": 1328, "怵": 1329, "总": 1330, "恂": 1331, "恋": 1332, "恐": 1333, "恒": 1334, "恕": 1335, "恢": 1336, "恨": 1337, "恩": 1338, "恪": 1339, "恬": 1340, "恭": 1341, "息": 1342, "恰": 1343, "恶": 1344, "恺": 1345, "恼": 1346, "恽": 1347, "悄": 1348, "悉": 1349, "悌": 1350, "悍": 1351, "悔": 1352, "悖": 1353, "悚": 1354, "悟": 1355, "悠": 1356, "患": 1357, "悦": 1358, "您": 1359, "悫": 1360, "悬": 1361, "悲": 1362, "悼": 1363, "情": 1364, "惊": 1365, "惑": 1366, "惕": 1367, "惘": 1368, "惜": 1369, "惟": 1370, "惠": 1371, "惧": 1372, "惨": 1373, "惩": 1374, "惪": 1375, "惬": 1376, "惭": 1377, "惯": 1378, "惰": 1379, "想": 1380, "惹": 1381, "惺": 1382, "愁": 1383, "愈": 1384, "愉": 1385, "意": 1386, "愔": 1387, "愕": 1388, "愚": 1389, "感": 1390, "愤": 1391, "愧": 1392, "愿": 
1393, "慈": 1394, "慌": 1395, "慎": 1396, "慑": 1397, "慕": 1398, "慢": 1399, "慧": 1400, "慨": 1401, "慰": 1402, "慷": 1403, "慾": 1404, "憍": 1405, "憧": 1406, "憨": 1407, "憩": 1408, "憬": 1409, "憾": 1410, "懂": 1411, "懈": 1412, "懋": 1413, "懒": 1414, "懔": 1415, "懦": 1416, "懿": 1417, "戈": 1418, "戊": 1419, "戌": 1420, "戍": 1421, "戎": 1422, "戏": 1423, "成": 1424, "我": 1425, "戒": 1426, "或": 1427, "战": 1428, "戚": 1429, "戛": 1430, "戟": 1431, "截": 1432, "戮": 1433, "戴": 1434, "户": 1435, "房": 1436, "所": 1437, "扁": 1438, "扇": 1439, "扈": 1440, "扉": 1441, "手": 1442, "才": 1443, "扎": 1444, "扑": 1445, "打": 1446, "扔": 1447, "托": 1448, "扣": 1449, "执": 1450, "扩": 1451, "扫": 1452, "扬": 1453, "扭": 1454, "扮": 1455, "扯": 1456, "扰": 1457, "扶": 1458, "批": 1459, "找": 1460, "承": 1461, "技": 1462, "抄": 1463, "把": 1464, "抑": 1465, "抒": 1466, "抓": 1467, "投": 1468, "抖": 1469, "抗": 1470, "折": 1471, "抚": 1472, "抛": 1473, "抢": 1474, "护": 1475, "报": 1476, "披": 1477, "抬": 1478, "抱": 1479, "抵": 1480, "抹": 1481, "押": 1482, "抽": 1483, "拂": 1484, "担": 1485, "拆": 1486, "拈": 1487, "拉": 1488, "拌": 1489, "拍": 1490, "拐": 1491, "拒": 1492, "拓": 1493, "拔": 1494, "拖": 1495, "拗": 1496, "拘": 1497, "招": 1498, "拜": 1499, "拟": 1500, "拢": 1501, "拣": 1502, "拥": 1503, "拦": 1504, "拨": 1505, "择": 1506, "括": 1507, "拮": 1508, "拯": 1509, "拱": 1510, "拳": 1511, "拷": 1512, "拼": 1513, "拾": 1514, "拿": 1515, "持": 1516, "挂": 1517, "指": 1518, "按": 1519, "挑": 1520, "挖": 1521, "挚": 1522, "挛": 1523, "挝": 1524, "挞": 1525, "挟": 1526, "挡": 1527, "挤": 1528, "挥": 1529, "挪": 1530, "挫": 1531, "振": 1532, "挹": 1533, "挺": 1534, "挽": 1535, "捆": 1536, "捉": 1537, "捍": 1538, "捏": 1539, "捐": 1540, "捕": 1541, "捞": 1542, "损": 1543, "捡": 1544, "换": 1545, "捣": 1546, "捧": 1547, "据": 1548, "捷": 1549, "掀": 1550, "授": 1551, "掉": 1552, "掌": 1553, "掏": 1554, "排": 1555, "掖": 1556, "掘": 1557, "掛": 1558, "掠": 1559, "探": 1560, "接": 1561, "控": 1562, "推": 1563, "掩": 1564, "措": 1565, "掳": 1566, "掷": 1567, "掸": 1568, "掾": 1569, "揆": 1570, "揉": 1571, "揍": 1572, "描": 1573, "提": 1574, "插": 1575, "揖": 1576, "握": 1577, "揣": 1578, "揭": 1579, "援": 1580, "揷": 1581, "揽": 1582, "搁": 1583, "搅": 1584, "搏": 1585, "搜": 1586, "搞": 1587, "搪": 1588, "搬": 1589, "搭": 1590, "携": 1591, "摄": 1592, "摆": 1593, "摇": 1594, "摊": 1595, "摔": 1596, "摘": 1597, "摧": 1598, "摩": 1599, "摸": 1600, "摹": 1601, "摺": 1602, "撑": 1603, "撒": 1604, "撕": 1605, "撞": 1606, "撤": 1607, "播": 1608, "撮": 1609, "撰": 1610, "撼": 1611, "擂": 1612, "擅": 1613, "操": 1614, "擎": 1615, "擒": 1616, "擢": 1617, "擦": 1618, "攀": 1619, "攒": 1620, "攥": 1621, "支": 1622, "收": 1623, "攸": 1624, "改": 1625, "攻": 1626, "放": 1627, "政": 1628, "故": 1629, "效": 1630, "敌": 1631, "敏": 1632, "救": 1633, "敕": 1634, "敖": 1635, "教": 1636, "敛": 1637, "敞": 1638, "敢": 1639, "散": 1640, "敦": 1641, "敬": 1642, "数": 1643, "敲": 1644, "整": 1645, "敷": 1646, "文": 1647, "斋": 1648, "斌": 1649, "斐": 1650, "斑": 1651, "斗": 1652, "料": 1653, "斛": 1654, "斜": 1655, "斡": 1656, "斤": 1657, "斥": 1658, "斧": 1659, "斩": 1660, "断": 1661, "斯": 1662, "新": 1663, "方": 1664, "於": 1665, "施": 1666, "旁": 1667, "旅": 1668, "旆": 1669, "旋": 1670, "旌": 1671, "族": 1672, "旗": 1673, "无": 1674, "既": 1675, "日": 1676, "旦": 1677, "旧": 1678, "旨": 1679, "早": 1680, "旬": 1681, "旭": 1682, "旱": 1683, "时": 1684, "旺": 1685, "昀": 1686, "昂": 1687, "昆": 1688, "昇": 1689, "昊": 1690, "昌": 1691, "明": 1692, "昏": 1693, "易": 1694, "昔": 1695, "昕": 1696, "昙": 1697, "星": 1698, "映": 1699, "春": 1700, "昧": 1701, "昨": 1702, "昭": 1703, "是": 1704, "昴": 1705, "昵": 1706, "昶": 1707, "昼": 1708, "显": 1709, "晁": 1710, "晃": 1711, "晊": 1712, "晋": 1713, "晏": 1714, "晒": 1715, "晓": 
1716, "晔": 1717, "晕": 1718, "晖": 1719, "晚": 1720, "晟": 1721, "晤": 1722, "晦": 1723, "晨": 1724, "普": 1725, "景": 1726, "晰": 1727, "晴": 1728, "晶": 1729, "智": 1730, "暂": 1731, "暄": 1732, "暅": 1733, "暎": 1734, "暑": 1735, "暖": 1736, "暗": 1737, "暨": 1738, "暮": 1739, "暴": 1740, "暹": 1741, "曙": 1742, "曜": 1743, "曝": 1744, "曦": 1745, "曰": 1746, "曲": 1747, "曳": 1748, "更": 1749, "曷": 1750, "曹": 1751, "曼": 1752, "曾": 1753, "替": 1754, "最": 1755, "月": 1756, "有": 1757, "朋": 1758, "服": 1759, "朔": 1760, "朕": 1761, "朗": 1762, "望": 1763, "朝": 1764, "期": 1765, "木": 1766, "未": 1767, "末": 1768, "本": 1769, "札": 1770, "术": 1771, "朱": 1772, "朴": 1773, "朵": 1774, "机": 1775, "朽": 1776, "杀": 1777, "杂": 1778, "权": 1779, "杆": 1780, "杉": 1781, "李": 1782, "杏": 1783, "材": 1784, "村": 1785, "杓": 1786, "杖": 1787, "杙": 1788, "杜": 1789, "束": 1790, "杠": 1791, "条": 1792, "来": 1793, "杨": 1794, "杭": 1795, "杯": 1796, "杰": 1797, "杻": 1798, "松": 1799, "板": 1800, "极": 1801, "构": 1802, "枋": 1803, "析": 1804, "枕": 1805, "林": 1806, "枚": 1807, "果": 1808, "枝": 1809, "枞": 1810, "枢": 1811, "枣": 1812, "枨": 1813, "枪": 1814, "枫": 1815, "枭": 1816, "枯": 1817, "枳": 1818, "架": 1819, "柃": 1820, "柄": 1821, "柏": 1822, "某": 1823, "柑": 1824, "染": 1825, "柔": 1826, "柚": 1827, "柜": 1828, "柝": 1829, "柞": 1830, "柠": 1831, "查": 1832, "柩": 1833, "柬": 1834, "柯": 1835, "柰": 1836, "柱": 1837, "柳": 1838, "柴": 1839, "査": 1840, "柽": 1841, "柿": 1842, "栀": 1843, "栃": 1844, "栅": 1845, "标": 1846, "栈": 1847, "栉": 1848, "栋": 1849, "栎": 1850, "栏": 1851, "树": 1852, "栓": 1853, "栖": 1854, "栗": 1855, "校": 1856, "栢": 1857, "栩": 1858, "株": 1859, "栲": 1860, "栳": 1861, "样": 1862, "核": 1863, "根": 1864, "栻": 1865, "格": 1866, "栽": 1867, "栾": 1868, "桀": 1869, "桂": 1870, "桃": 1871, "桄": 1872, "桅": 1873, "框": 1874, "案": 1875, "桉": 1876, "桌": 1877, "桐": 1878, "桑": 1879, "桓": 1880, "桔": 1881, "桕": 1882, "桝": 1883, "桡": 1884, "桢": 1885, "档": 1886, "桤": 1887, "桥": 1888, "桦": 1889, "桧": 1890, "桨": 1891, "桩": 1892, "桫": 1893, "桶": 1894, "梁": 1895, "梅": 1896, "梓": 1897, "梗": 1898, "梢": 1899, "梣": 1900, "梦": 1901, "梧": 1902, "梨": 1903, "梭": 1904, "梯": 1905, "械": 1906, "梳": 1907, "梵": 1908, "梾": 1909, "检": 1910, "棁": 1911, "棉": 1912, "棋": 1913, "棍": 1914, "棒": 1915, "棕": 1916, "棘": 1917, "棚": 1918, "棠": 1919, "棣": 1920, "棨": 1921, "森": 1922, "棱": 1923, "棵": 1924, "棹": 1925, "棺": 1926, "棻": 1927, "椅": 1928, "椋": 1929, "植": 1930, "椎": 1931, "椒": 1932, "検": 1933, "椤": 1934, "椭": 1935, "椰": 1936, "椴": 1937, "椹": 1938, "椿": 1939, "楔": 1940, "楚": 1941, "楝": 1942, "楞": 1943, "楠": 1944, "楣": 1945, "楦": 1946, "楫": 1947, "楮": 1948, "楯": 1949, "楷": 1950, "楸": 1951, "楹": 1952, "楼": 1953, "概": 1954, "榄": 1955, "榆": 1956, "榈": 1957, "榉": 1958, "榔": 1959, "榕": 1960, "榖": 1961, "榙": 1962, "榛": 1963, "榜": 1964, "榧": 1965, "榨": 1966, "榭": 1967, "榴": 1968, "榻": 1969, "槐": 1970, "槚": 1971, "槛": 1972, "槟": 1973, "槭": 1974, "槱": 1975, "槲": 1976, "槽": 1977, "槿": 1978, "樊": 1979, "樟": 1980, "模": 1981, "樨": 1982, "横": 1983, "樱": 1984, "樵": 1985, "樽": 1986, "樾": 1987, "橄": 1988, "橇": 1989, "橐": 1990, "橘": 1991, "橙": 1992, "橡": 1993, "檀": 1994, "檐": 1995, "檗": 1996, "檬": 1997, "欠": 1998, "次": 1999, "欢": 2000, "欣": 2001, "欧": 2002, "欲": 2003, "欸": 2004, "欺": 2005, "款": 2006, "歆": 2007, "歇": 2008, "歉": 2009, "歌": 2010, "歙": 2011, "止": 2012, "正": 2013, "此": 2014, "步": 2015, "武": 2016, "歧": 2017, "歪": 2018, "歹": 2019, "死": 2020, "歼": 2021, "殃": 2022, "殆": 2023, "殉": 2024, "殊": 2025, "残": 2026, "殖": 2027, "殡": 2028, "殴": 2029, "段": 2030, "殷": 2031, "殿": 2032, "毁": 2033, "毅": 2034, "毋": 2035, "母": 2036, "每": 2037, "毒": 2038, "毓": 
2039, "比": 2040, "毕": 2041, "毗": 2042, "毙": 2043, "毛": 2044, "毡": 2045, "毫": 2046, "氏": 2047, "民": 2048, "氓": 2049, "气": 2050, "氖": 2051, "氙": 2052, "氛": 2053, "氟": 2054, "���": 2055, "氢": 2056, "氦": 2057, "氧": 2058, "氨": 2059, "氮": 2060, "氯": 2061, "氰": 2062, "水": 2063, "永": 2064, "汀": 2065, "汁": 2066, "求": 2067, "汇": 2068, "汉": 2069, "汐": 2070, "汕": 2071, "汗": 2072, "汛": 2073, "汜": 2074, "汝": 2075, "汞": 2076, "江": 2077, "池": 2078, "污": 2079, "汤": 2080, "汪": 2081, "汰": 2082, "汲": 2083, "汴": 2084, "汶": 2085, "汹": 2086, "汽": 2087, "汾": 2088, "沁": 2089, "沂": 2090, "沃": 2091, "沅": 2092, "沆": 2093, "沈": 2094, "沉": 2095, "沌": 2096, "沐": 2097, "沔": 2098, "沙": 2099, "沛": 2100, "沟": 2101, "没": 2102, "沤": 2103, "沥": 2104, "沦": 2105, "沧": 2106, "沪": 2107, "沫": 2108, "沭": 2109, "沱": 2110, "河": 2111, "沸": 2112, "油": 2113, "治": 2114, "沼": 2115, "沽": 2116, "沾": 2117, "沿": 2118, "泄": 2119, "泉": 2120, "泊": 2121, "泌": 2122, "泓": 2123, "法": 2124, "泗": 2125, "泛": 2126, "泠": 2127, "泡": 2128, "波": 2129, "泣": 2130, "泥": 2131, "注": 2132, "泪": 2133, "泮": 2134, "泯": 2135, "泰": 2136, "泱": 2137, "泳": 2138, "泵": 2139, "泷": 2140, "泸": 2141, "泻": 2142, "泼": 2143, "泽": 2144, "泾": 2145, "洁": 2146, "洄": 2147, "洋": 2148, "洐": 2149, "洒": 2150, "洗": 2151, "洙": 2152, "洛": 2153, "洞": 2154, "洣": 2155, "津": 2156, "洪": 2157, "洮": 2158, "洱": 2159, "洲": 2160, "洵": 2161, "洹": 2162, "活": 2163, "洼": 2164, "洽": 2165, "派": 2166, "流": 2167, "浅": 2168, "浆": 2169, "浇": 2170, "浉": 2171, "浊": 2172, "测": 2173, "济": 2174, "浏": 2175, "浑": 2176, "浒": 2177, "浓": 2178, "浔": 2179, "浙": 2180, "浚": 2181, "浜": 2182, "浞": 2183, "浦": 2184, "浩": 2185, "浪": 2186, "浮": 2187, "浴": 2188, "海": 2189, "浸": 2190, "涂": 2191, "涅": 2192, "消": 2193, "涉": 2194, "涌": 2195, "涓": 2196, "涛": 2197, "涝": 2198, "涞": 2199, "涟": 2200, "涡": 2201, "润": 2202, "涧": 2203, "涨": 2204, "涩": 2205, "涪": 2206, "涯": 2207, "液": 2208, "涵": 2209, "淀": 2210, "淄": 2211, "淅": 2212, "淆": 2213, "淇": 2214, "淋": 2215, "淑": 2216, "淖": 2217, "淘": 2218, "淞": 2219, "淡": 2220, "淤": 2221, "淫": 2222, "淮": 2223, "淯": 2224, "深": 2225, "淳": 2226, "混": 2227, "淹": 2228, "添": 2229, "淼": 2230, "清": 2231, "渊": 2232, "渌": 2233, "渍": 2234, "渎": 2235, "渐": 2236, "渔": 2237, "渗": 2238, "渚": 2239, "渝": 2240, "渠": 2241, "渡": 2242, "渣": 2243, "渤": 2244, "渥": 2245, "温": 2246, "渭": 2247, "港": 2248, "渲": 2249, "渴": 2250, "游": 2251, "湄": 2252, "湍": 2253, "湎": 2254, "湓": 2255, "湖": 2256, "湘": 2257, "湛": 2258, "湜": 2259, "湟": 2260, "湳": 2261, "湾": 2262, "湿": 2263, "溃": 2264, "溅": 2265, "溉": 2266, "源": 2267, "準": 2268, "溞": 2269, "溢": 2270, "溥": 2271, "溧": 2272, "溪": 2273, "溯": 2274, "溲": 2275, "溴": 2276, "溶": 2277, "溺": 2278, "滁": 2279, "滇": 2280, "滋": 2281, "滑": 2282, "滔": 2283, "滕": 2284, "滚": 2285, "滞": 2286, "满": 2287, "滤": 2288, "滥": 2289, "滦": 2290, "滨": 2291, "滩": 2292, "滴": 2293, "漂": 2294, "漆": 2295, "漏": 2296, "演": 2297, "漕": 2298, "漠": 2299, "漩": 2300, "漪": 2301, "漫": 2302, "漳": 2303, "漾": 2304, "潇": 2305, "潍": 2306, "潘": 2307, "潜": 2308, "潞": 2309, "潢": 2310, "潦": 2311, "潭": 2312, "潮": 2313, "潼": 2314, "澄": 2315, "澈": 2316, "澍": 2317, "澎": 2318, "澜": 2319, "澡": 2320, "澥": 2321, "澧": 2322, "澳": 2323, "澶": 2324, "激": 2325, "濂": 2326, "濉": 2327, "濑": 2328, "濒": 2329, "濠": 2330, "濡": 2331, "濮": 2332, "濯": 2333, "瀑": 2334, "瀚": 2335, "瀛": 2336, "瀼": 2337, "灌": 2338, "灏": 2339, "火": 2340, "灭": 2341, "灯": 2342, "灰": 2343, "灵": 2344, "灶": 2345, "灸": 2346, "灼": 2347, "灾": 2348, "灿": 2349, "炀": 2350, "炉": 2351, "炎": 2352, "炒": 2353, "炔": 2354, "炕": 2355, "炖": 2356, "炜": 2357, "炫": 2358, "炬": 2359, "炭": 2360, "炮": 2361, "炳": 
2362, "炸": 2363, "点": 2364, "炼": 2365, "炽": 2366, "烁": 2367, "烂": 2368, "烃": 2369, "烈": 2370, "烘": 2371, "烙": 2372, "烛": 2373, "烟": 2374, "烤": 2375, "烦": 2376, "烧": 2377, "烨": 2378, "热": 2379, "烯": 2380, "烷": 2381, "烹": 2382, "烺": 2383, "烽": 2384, "焉": 2385, "焊": 2386, "焕": 2387, "焙": 2388, "焚": 2389, "焦": 2390, "焮": 2391, "焯": 2392, "焰": 2393, "焱": 2394, "然": 2395, "煊": 2396, "煌": 2397, "煎": 2398, "煜": 2399, "煞": 2400, "煤": 2401, "煦": 2402, "照": 2403, "煮": 2404, "煲": 2405, "煽": 2406, "熄": 2407, "熈": 2408, "熊": 2409, "熔": 2410, "熙": 2411, "熟": 2412, "熠": 2413, "熬": 2414, "熹": 2415, "燃": 2416, "燏": 2417, "燕": 2418, "燥": 2419, "燮": 2420, "燹": 2421, "爆": 2422, "爪": 2423, "爬": 2424, "爱": 2425, "爵": 2426, "父": 2427, "爷": 2428, "爸": 2429, "爹": 2430, "爽": 2431, "牁": 2432, "牂": 2433, "片": 2434, "版": 2435, "牌": 2436, "牕": 2437, "牙": 2438, "牛": 2439, "牟": 2440, "牡": 2441, "牢": 2442, "牦": 2443, "牧": 2444, "物": 2445, "牲": 2446, "牵": 2447, "特": 2448, "牺": 2449, "牻": 2450, "犀": 2451, "犁": 2452, "犍": 2453, "犬": 2454, "犯": 2455, "状": 2456, "犷": 2457, "犹": 2458, "狂": 2459, "狄": 2460, "狐": 2461, "狒": 2462, "狗": 2463, "狙": 2464, "狠": 2465, "狡": 2466, "狩": 2467, "独": 2468, "狭": 2469, "狮": 2470, "狯": 2471, "狱": 2472, "狷": 2473, "狸": 2474, "狼": 2475, "猄": 2476, "猎": 2477, "猖": 2478, "猗": 2479, "猛": 2480, "猜": 2481, "猝": 2482, "猩": 2483, "猪": 2484, "猫": 2485, "猬": 2486, "献": 2487, "猴": 2488, "猾": 2489, "猿": 2490, "獐": 2491, "獗": 2492, "獭": 2493, "獴": 2494, "玄": 2495, "率": 2496, "玉": 2497, "王": 2498, "玎": 2499, "玑": 2500, "玖": 2501, "玛": 2502, "玠": 2503, "玩": 2504, "玫": 2505, "玭": 2506, "玮": 2507, "环": 2508, "现": 2509, "玲": 2510, "玶": 2511, "玹": 2512, "玺": 2513, "玻": 2514, "珀": 2515, "珂": 2516, "珈": 2517, "珊": 2518, "珍": 2519, "珐": 2520, "珑": 2521, "珙": 2522, "珞": 2523, "珠": 2524, "珩": 2525, "班": 2526, "珰": 2527, "珲": 2528, "珺": 2529, "球": 2530, "琅": 2531, "理": 2532, "琉": 2533, "琊": 2534, "琏": 2535, "琐": 2536, "琚": 2537, "琛": 2538, "琢": 2539, "琥": 2540, "琦": 2541, "琨": 2542, "琪": 2543, "琬": 2544, "琮": 2545, "琰": 2546, "琳": 2547, "琴": 2548, "琵": 2549, "琶": 2550, "琼": 2551, "瑀": 2552, "瑄": 2553, "瑙": 2554, "瑚": 2555, "瑛": 2556, "瑜": 2557, "瑞": 2558, "瑟": 2559, "瑭": 2560, "瑮": 2561, "瑰": 2562, "瑳": 2563, "瑶": 2564, "瑷": 2565, "瑾": 2566, "璃": 2567, "璆": 2568, "璇": 2569, "璋": 2570, "璎": 2571, "璜": 2572, "璟": 2573, "璧": 2574, "璹": 2575, "瓒": 2576, "瓘": 2577, "瓛": 2578, "瓜": 2579, "瓢": 2580, "瓣": 2581, "瓦": 2582, "瓮": 2583, "瓯": 2584, "瓶": 2585, "瓷": 2586, "甄": 2587, "甘": 2588, "甚": 2589, "甜": 2590, "生": 2591, "甥": 2592, "用": 2593, "甫": 2594, "甬": 2595, "田": 2596, "由": 2597, "甲": 2598, "申": 2599, "电": 2600, "男": 2601, "甸": 2602, "町": 2603, "画": 2604, "甾": 2605, "畅": 2606, "畈": 2607, "畋": 2608, "界": 2609, "畏": 2610, "畔": 2611, "留": 2612, "畜": 2613, "略": 2614, "番": 2615, "畲": 2616, "畴": 2617, "畸": 2618, "畿": 2619, "疃": 2620, "疆": 2621, "疍": 2622, "疏": 2623, "疑": 2624, "疖": 2625, "疗": 2626, "疟": 2627, "疡": 2628, "疣": 2629, "疤": 2630, "疫": 2631, "疮": 2632, "疯": 2633, "疲": 2634, "疹": 2635, "疼": 2636, "疾": 2637, "病": 2638, "症": 2639, "痉": 2640, "痒": 2641, "痕": 2642, "痘": 2643, "痛": 2644, "痢": 2645, "痪": 2646, "痫": 2647, "痴": 2648, "痹": 2649, "痼": 2650, "瘟": 2651, "瘤": 2652, "瘦": 2653, "瘫": 2654, "瘰": 2655, "瘾": 2656, "瘿": 2657, "癌": 2658, "癣": 2659, "癫": 2660, "癸": 2661, "登": 2662, "白": 2663, "百": 2664, "皂": 2665, "的": 2666, "皆": 2667, "皇": 2668, "皋": 2669, "皓": 2670, "皕": 2671, "皖": 2672, "皮": 2673, "皱": 2674, "皿": 2675, "盂": 2676, "盆": 2677, "盈": 2678, "益": 2679, "盏": 2680, "盐": 2681, "监": 2682, "盒": 2683, "盔": 2684, "盖": 
2685, "盗": 2686, "盘": 2687, "盛": 2688, "盟": 2689, "盥": 2690, "目": 2691, "盯": 2692, "盱": 2693, "盲": 2694, "直": 2695, "相": 2696, "盼": 2697, "盾": 2698, "省": 2699, "眉": 2700, "看": 2701, "県": 2702, "眙": 2703, "真": 2704, "眠": 2705, "眩": 2706, "眶": 2707, "眷": 2708, "眺": 2709, "眼": 2710, "着": 2711, "睁": 2712, "睐": 2713, "睛": 2714, "睡": 2715, "睢": 2716, "督": 2717, "睦": 2718, "睫": 2719, "睹": 2720, "睽": 2721, "睾": 2722, "睿": 2723, "瞄": 2724, "瞎": 2725, "瞒": 2726, "瞧": 2727, "瞩": 2728, "瞬": 2729, "瞭": 2730, "瞳": 2731, "瞻": 2732, "瞽": 2733, "瞿": 2734, "矍": 2735, "矗": 2736, "矛": 2737, "矢": 2738, "矣": 2739, "知": 2740, "矩": 2741, "矫": 2742, "短": 2743, "矮": 2744, "石": 2745, "矶": 2746, "矾": 2747, "矿": 2748, "砀": 2749, "码": 2750, "砂": 2751, "砌": 2752, "砍": 2753, "砒": 2754, "研": 2755, "砖": 2756, "砗": 2757, "砚": 2758, "砥": 2759, "破": 2760, "砵": 2761, "砷": 2762, "砸": 2763, "砻": 2764, "砾": 2765, "础": 2766, "硅": 2767, "硒": 2768, "硕": 2769, "硖": 2770, "硝": 2771, "硫": 2772, "硬": 2773, "确": 2774, "硼": 2775, "碉": 2776, "碍": 2777, "碎": 2778, "碑": 2779, "碗": 2780, "碘": 2781, "碟": 2782, "碧": 2783, "碰": 2784, "碱": 2785, "碲": 2786, "碳": 2787, "碾": 2788, "磁": 2789, "磅": 2790, "磐": 2791, "磡": 2792, "磨": 2793, "磲": 2794, "磷": 2795, "礁": 2796, "示": 2797, "礼": 2798, "社": 2799, "祀": 2800, "祁": 2801, "祈": 2802, "祉": 2803, "祋": 2804, "祎": 2805, "祕": 2806, "祖": 2807, "祗": 2808, "祚": 2809, "祛": 2810, "祜": 2811, "祝": 2812, "神": 2813, "祠": 2814, "祥": 2815, "票": 2816, "祭": 2817, "祯": 2818, "祷": 2819, "祸": 2820, "祺": 2821, "禁": 2822, "禄": 2823, "禅": 2824, "福": 2825, "禑": 2826, "禧": 2827, "禹": 2828, "禺": 2829, "离": 2830, "禾": 2831, "秀": 2832, "私": 2833, "秆": 2834, "秉": 2835, "秋": 2836, "种": 2837, "科": 2838, "秒": 2839, "秘": 2840, "租": 2841, "秤": 2842, "秦": 2843, "秧": 2844, "秩": 2845, "积": 2846, "称": 2847, "移": 2848, "秽": 2849, "稀": 2850, "稃": 2851, "程": 2852, "稍": 2853, "税": 2854, "稔": 2855, "稗": 2856, "稚": 2857, "稞": 2858, "稠": 2859, "稣": 2860, "稳": 2861, "稷": 2862, "稺": 2863, "稻": 2864, "稼": 2865, "稽": 2866, "稿": 2867, "穆": 2868, "穗": 2869, "穴": 2870, "究": 2871, "穷": 2872, "穹": 2873, "空": 2874, "穿": 2875, "突": 2876, "窃": 2877, "窄": 2878, "窈": 2879, "窑": 2880, "窒": 2881, "窕": 2882, "窖": 2883, "窗": 2884, "窜": 2885, "窝": 2886, "窟": 2887, "窥": 2888, "窦": 2889, "竈": 2890, "立": 2891, "竖": 2892, "站": 2893, "竞": 2894, "竟": 2895, "章": 2896, "竣": 2897, "童": 2898, "竭": 2899, "端": 2900, "竹": 2901, "竺": 2902, "竿": 2903, "笃": 2904, "笄": 2905, "笆": 2906, "笋": 2907, "笏": 2908, "笑": 2909, "笔": 2910, "笙": 2911, "笛": 2912, "笞": 2913, "笠": 2914, "符": 2915, "笨": 2916, "第": 2917, "笮": 2918, "笼": 2919, "筅": 2920, "等": 2921, "筋": 2922, "筐": 2923, "筑": 2924, "筒": 2925, "答": 2926, "策": 2927, "筛": 2928, "筮": 2929, "筱": 2930, "筲": 2931, "筷": 2932, "筹": 2933, "签": 2934, "简": 2935, "箕": 2936, "算": 2937, "管": 2938, "箨": 2939, "箩": 2940, "箬": 2941, "箭": 2942, "箱": 2943, "箴": 2944, "篆": 2945, "篇": 2946, "篙": 2947, "篡": 2948, "篦": 2949, "篮": 2950, "篱": 2951, "篷": 2952, "簇": 2953, "簕": 2954, "簧": 2955, "簪": 2956, "簸": 2957, "簽": 2958, "簿": 2959, "籁": 2960, "籍": 2961, "米": 2962, "籴": 2963, "类": 2964, "籽": 2965, "粉": 2966, "粒": 2967, "粗": 2968, "粘": 2969, "粟": 2970, "粤": 2971, "粥": 2972, "粪": 2973, "粮": 2974, "粲": 2975, "粹": 2976, "精": 2977, "糊": 2978, "糕": 2979, "糖": 2980, "糙": 2981, "糟": 2982, "糠": 2983, "糯": 2984, "系": 2985, "紊": 2986, "紑": 2987, "素": 2988, "索": 2989, "紧": 2990, "紫": 2991, "累": 2992, "絮": 2993, "綖": 2994, "綦": 2995, "緁": 2996, "縻": 2997, "繁": 2998, "纂": 2999, "纠": 3000, "红": 3001, "纤": 3002, "纥": 3003, "约": 3004, "级": 3005, "纪": 3006, "纬": 3007, "纭": 
3008, "纮": 3009, "纯": 3010, "纱": 3011, "纲": 3012, "纳": 3013, "纵": 3014, "纶": 3015, "纷": 3016, "纸": 3017, "纹": 3018, "纺": 3019, "纻": 3020, "纽": 3021, "线": 3022, "绂": 3023, "练": 3024, "组": 3025, "绅": 3026, "细": 3027, "织": 3028, "终": 3029, "绊": 3030, "绍": 3031, "绎": 3032, "经": 3033, "绑": 3034, "绒": 3035, "结": 3036, "绕": 3037, "绘": 3038, "给": 3039, "绚": 3040, "绛": 3041, "络": 3042, "绝": 3043, "绞": 3044, "统": 3045, "绡": 3046, "绢": 3047, "绣": 3048, "绥": 3049, "绦": 3050, "继": 3051, "绩": 3052, "绪": 3053, "绫": 3054, "续": 3055, "绮": 3056, "绯": 3057, "绰": 3058, "绳": 3059, "维": 3060, "绵": 3061, "绶": 3062, "绸": 3063, "综": 3064, "绽": 3065, "绿": 3066, "缀": 3067, "缅": 3068, "缆": 3069, "缇": 3070, "缉": 3071, "缓": 3072, "缔": 3073, "缕": 3074, "编": 3075, "缘": 3076, "缙": 3077, "缚": 3078, "缜": 3079, "缝": 3080, "缠": 3081, "缨": 3082, "缩": 3083, "缪": 3084, "缮": 3085, "缴": 3086, "缵": 3087, "缸": 3088, "缺": 3089, "罂": 3090, "罄": 3091, "罐": 3092, "网": 3093, "罔": 3094, "罕": 3095, "罗": 3096, "罘": 3097, "罚": 3098, "罢": 3099, "罩": 3100, "罪": 3101, "置": 3102, "署": 3103, "罹": 3104, "罽": 3105, "羁": 3106, "羊": 3107, "羌": 3108, "美": 3109, "羚": 3110, "羞": 3111, "羟": 3112, "羡": 3113, "群": 3114, "羧": 3115, "羯": 3116, "羰": 3117, "羲": 3118, "羽": 3119, "翁": 3120, "翃": 3121, "翅": 3122, "翊": 3123, "翌": 3124, "翎": 3125, "翔": 3126, "翘": 3127, "翟": 3128, "翠": 3129, "翡": 3130, "翥": 3131, "翦": 3132, "翰": 3133, "翱": 3134, "翻": 3135, "翼": 3136, "翽": 3137, "耀": 3138, "老": 3139, "考": 3140, "者": 3141, "耆": 3142, "而": 3143, "耍": 3144, "耐": 3145, "耕": 3146, "耗": 3147, "耘": 3148, "耦": 3149, "耧": 3150, "耨": 3151, "耳": 3152, "耶": 3153, "耸": 3154, "耻": 3155, "耽": 3156, "耿": 3157, "聂": 3158, "聆": 3159, "聊": 3160, "聋": 3161, "职": 3162, "联": 3163, "聘": 3164, "聚": 3165, "聪": 3166, "聿": 3167, "肃": 3168, "肄": 3169, "肆": 3170, "肇": 3171, "肉": 3172, "肋": 3173, "肌": 3174, "肖": 3175, "肘": 3176, "肚": 3177, "肛": 3178, "肝": 3179, "肟": 3180, "肠": 3181, "股": 3182, "肢": 3183, "肤": 3184, "肥": 3185, "肩": 3186, "肪": 3187, "肯": 3188, "肱": 3189, "育": 3190, "肴": 3191, "肺": 3192, "肼": 3193, "肽": 3194, "肾": 3195, "肿": 3196, "胀": 3197, "胁": 3198, "胃": 3199, "胄": 3200, "胆": 3201, "背": 3202, "胍": 3203, "胎": 3204, "胖": 3205, "胚": 3206, "胛": 3207, "胜": 3208, "胞": 3209, "胡": 3210, "胤": 3211, "胥": 3212, "胪": 3213, "胫": 3214, "胭": 3215, "胰": 3216, "胱": 3217, "胳": 3218, "胶": 3219, "胸": 3220, "胺": 3221, "胼": 3222, "能": 3223, "脂": 3224, "脆": 3225, "脉": 3226, "脊": 3227, "脏": 3228, "脐": 3229, "脑": 3230, "脓": 3231, "脖": 3232, "脚": 3233, "脱": 3234, "脲": 3235, "脸": 3236, "脾": 3237, "腈": 3238, "腊": 3239, "腌": 3240, "腐": 3241, "腓": 3242, "腔": 3243, "腕": 3244, "腥": 3245, "腧": 3246, "腭": 3247, "腮": 3248, "腰": 3249, "腱": 3250, "腹": 3251, "腺": 3252, "腻": 3253, "腾": 3254, "腿": 3255, "膀": 3256, "膈": 3257, "膊": 3258, "膏": 3259, "膑": 3260, "膛": 3261, "膜": 3262, "膝": 3263, "膦": 3264, "膨": 3265, "膳": 3266, "膺": 3267, "膻": 3268, "臀": 3269, "臂": 3270, "臣": 3271, "臧": 3272, "自": 3273, "臭": 3274, "臯": 3275, "至": 3276, "致": 3277, "臻": 3278, "臼": 3279, "舄": 3280, "舅": 3281, "舆": 3282, "舌": 3283, "舍": 3284, "舒": 3285, "舘": 3286, "舜": 3287, "舞": 3288, "舟": 3289, "航": 3290, "舫": 3291, "般": 3292, "舰": 3293, "舱": 3294, "舶": 3295, "船": 3296, "艇": 3297, "艘": 3298, "艮": 3299, "良": 3300, "艰": 3301, "色": 3302, "艳": 3303, "艺": 3304, "艾": 3305, "节": 3306, "芃": 3307, "芈": 3308, "芊": 3309, "芋": 3310, "芎": 3311, "芒": 3312, "芗": 3313, "芙": 3314, "芜": 3315, "芝": 3316, "芥": 3317, "芦": 3318, "芩": 3319, "芪": 3320, "芬": 3321, "芭": 3322, "芮": 3323, "芯": 3324, "芰": 3325, "花": 3326, "芳": 3327, "芷": 3328, "芸": 3329, "芹": 3330, "芽": 
3331, "苁": 3332, "苄": 3333, "苇": 3334, "苈": 3335, "苋": 3336, "苌": 3337, "苍": 3338, "苎": 3339, "苏": 3340, "苑": 3341, "苓": 3342, "苔": 3343, "苗": 3344, "苛": 3345, "苞": 3346, "苟": 3347, "苡": 3348, "苣": 3349, "若": 3350, "苦": 3351, "苯": 3352, "英": 3353, "苳": 3354, "苴": 3355, "苷": 3356, "苹": 3357, "苻": 3358, "苾": 3359, "茂": 3360, "范": 3361, "茄": 3362, "茅": 3363, "茉": 3364, "茌": 3365, "茎": 3366, "茔": 3367, "茛": 3368, "茜": 3369, "茧": 3370, "茨": 3371, "茫": 3372, "茯": 3373, "茱": 3374, "茵": 3375, "茶": 3376, "茸": 3377, "茹": 3378, "荀": 3379, "荁": 3380, "荃": 3381, "荆": 3382, "草": 3383, "荐": 3384, "荒": 3385, "荔": 3386, "荚": 3387, "荛": 3388, "荞": 3389, "荠": 3390, "荡": 3391, "荣": 3392, "荥": 3393, "荦": 3394, "荧": 3395, "荨": 3396, "荩": 3397, "荪": 3398, "荫": 3399, "药": 3400, "荷": 3401, "荸": 3402, "荼": 3403, "荽": 3404, "莅": 3405, "莆": 3406, "莉": 3407, "莎": 3408, "莒": 3409, "莓": 3410, "莘": 3411, "莞": 3412, "莩": 3413, "莪": 3414, "莫": 3415, "莱": 3416, "莲": 3417, "莴": 3418, "获": 3419, "莸": 3420, "莹": 3421, "莺": 3422, "莼": 3423, "莽": 3424, "菀": 3425, "菁": 3426, "菅": 3427, "菇": 3428, "菉": 3429, "菊": 3430, "菌": 3431, "菏": 3432, "菖": 3433, "菜": 3434, "菝": 3435, "菠": 3436, "菩": 3437, "菰": 3438, "菱": 3439, "菲": 3440, "萁": 3441, "萄": 3442, "萌": 3443, "萍": 3444, "萎": 3445, "萘": 3446, "萜": 3447, "萝": 3448, "萤": 3449, "营": 3450, "萦": 3451, "萧": 3452, "萨": 3453, "萱": 3454, "萸": 3455, "萼": 3456, "落": 3457, "葆": 3458, "葎": 3459, "著": 3460, "葛": 3461, "葜": 3462, "葡": 3463, "董": 3464, "葫": 3465, "葬": 3466, "葱": 3467, "葳": 3468, "葵": 3469, "葶": 3470, "蒂": 3471, "蒋": 3472, "蒙": 3473, "蒜": 3474, "蒟": 3475, "蒲": 3476, "蒴": 3477, "蒸": 3478, "蒺": 3479, "蒿": 3480, "蓄": 3481, "蓉": 3482, "蓍": 3483, "蓝": 3484, "蓟": 3485, "蓣": 3486, "蓬": 3487, "蓼": 3488, "蔑": 3489, "蔓": 3490, "蔗": 3491, "蔚": 3492, "蔡": 3493, "蔬": 3494, "蔵": 3495, "蔷": 3496, "蔺": 3497, "蔻": 3498, "蔽": 3499, "蕃": 3500, "蕈": 3501, "蕉": 3502, "蕊": 3503, "蕨": 3504, "蕲": 3505, "蕴": 3506, "蕾": 3507, "薄": 3508, "薇": 3509, "薖": 3510, "薛": 3511, "薨": 3512, "薪": 3513, "薮": 3514, "薯": 3515, "薹": 3516, "藁": 3517, "藉": 3518, "藏": 3519, "藓": 3520, "藔": 3521, "藕": 3522, "藜": 3523, "藤": 3524, "藨": 3525, "藩": 3526, "藳": 3527, "藻": 3528, "藿": 3529, "蘑": 3530, "蘸": 3531, "虎": 3532, "虏": 3533, "虐": 3534, "虑": 3535, "虔": 3536, "虚": 3537, "虞": 3538, "虫": 3539, "虬": 3540, "虱": 3541, "虹": 3542, "虻": 3543, "虽": 3544, "虾": 3545, "蚀": 3546, "蚁": 3547, "蚂": 3548, "蚊": 3549, "蚌": 3550, "蚓": 3551, "蚕": 3552, "蚜": 3553, "蚣": 3554, "蚤": 3555, "蚨": 3556, "蚪": 3557, "蚬": 3558, "蚯": 3559, "蚶": 3560, "蚺": 3561, "蛄": 3562, "蛇": 3563, "蛉": 3564, "蛊": 3565, "蛋": 3566, "蛎": 3567, "蛏": 3568, "蛙": 3569, "蛛": 3570, "蛤": 3571, "蛭": 3572, "蛮": 3573, "蛱": 3574, "蛲": 3575, "蛳": 3576, "蛸": 3577, "蛹": 3578, "蛾": 3579, "蜀": 3580, "蜂": 3581, "蜃": 3582, "蜈": 3583, "蜊": 3584, "蜍": 3585, "蜑": 3586, "蜒": 3587, "蜓": 3588, "蜕": 3589, "蜗": 3590, "蜘": 3591, "蜚": 3592, "蜜": 3593, "蜡": 3594, "蜢": 3595, "蜥": 3596, "蜱": 3597, "蜴": 3598, "蜻": 3599, "蜿": 3600, "蝇": 3601, "蝉": 3602, "蝌": 3603, "蝎": 3604, "蝙": 3605, "蝠": 3606, "蝥": 3607, "蝴": 3608, "蝶": 3609, "蝽": 3610, "蝾": 3611, "螂": 3612, "螃": 3613, "螈": 3614, "融": 3615, "螟": 3616, "螨": 3617, "螭": 3618, "螯": 3619, "螺": 3620, "蟀": 3621, "蟆": 3622, "蟋": 3623, "蟑": 3624, "蟒": 3625, "蟳": 3626, "蟹": 3627, "蟾": 3628, "蠋": 3629, "蠓": 3630, "蠕": 3631, "蠡": 3632, "蠹": 3633, "血": 3634, "衅": 3635, "行": 3636, "衍": 3637, "衔": 3638, "街": 3639, "衙": 3640, "衡": 3641, "衢": 3642, "衣": 3643, "补": 3644, "表": 3645, "衫": 3646, "衬": 3647, "衰": 3648, "衷": 3649, "袁": 3650, "袋": 3651, "袍": 3652, "袒": 3653, "袓": 
3654, "袖": 3655, "袗": 3656, "袜": 3657, "被": 3658, "袭": 3659, "裁": 3660, "裂": 3661, "装": 3662, "裔": 3663, "裕": 3664, "裘": 3665, "裙": 3666, "裤": 3667, "裬": 3668, "裴": 3669, "裸": 3670, "裹": 3671, "裾": 3672, "褐": 3673, "褒": 3674, "褚": 3675, "褧": 3676, "褪": 3677, "褶": 3678, "襄": 3679, "襟": 3680, "西": 3681, "要": 3682, "覃": 3683, "覆": 3684, "见": 3685, "观": 3686, "规": 3687, "觅": 3688, "视": 3689, "览": 3690, "觉": 3691, "觐": 3692, "觑": 3693, "角": 3694, "觚": 3695, "解": 3696, "触": 3697, "言": 3698, "詥": 3699, "詹": 3700, "誉": 3701, "誓": 3702, "諡": 3703, "諲": 3704, "謇": 3705, "警": 3706, "譬": 3707, "讚": 3708, "计": 3709, "订": 3710, "讣": 3711, "认": 3712, "讨": 3713, "让": 3714, "讪": 3715, "讫": 3716, "训": 3717, "议": 3718, "讯": 3719, "记": 3720, "讲": 3721, "讳": 3722, "讶": 3723, "讷": 3724, "许": 3725, "讹": 3726, "论": 3727, "讼": 3728, "讽": 3729, "设": 3730, "访": 3731, "诀": 3732, "证": 3733, "诃": 3734, "评": 3735, "识": 3736, "诈": 3737, "诉": 3738, "诊": 3739, "词": 3740, "诏": 3741, "译": 3742, "试": 3743, "诗": 3744, "诙": 3745, "诚": 3746, "诛": 3747, "话": 3748, "诞": 3749, "诟": 3750, "诠": 3751, "诡": 3752, "询": 3753, "诣": 3754, "该": 3755, "详": 3756, "诬": 3757, "语": 3758, "误": 3759, "诰": 3760, "诱": 3761, "说": 3762, "诵": 3763, "请": 3764, "诸": 3765, "诹": 3766, "诺": 3767, "读": 3768, "诽": 3769, "课": 3770, "谁": 3771, "调": 3772, "谅": 3773, "谈": 3774, "谊": 3775, "谋": 3776, "谌": 3777, "谍": 3778, "谎": 3779, "谏": 3780, "谐": 3781, "谒": 3782, "谓": 3783, "谕": 3784, "谗": 3785, "谚": 3786, "谛": 3787, "谜": 3788, "谟": 3789, "谠": 3790, "谢": 3791, "谣": 3792, "谤": 3793, "谥": 3794, "谦": 3795, "谨": 3796, "谪": 3797, "谬": 3798, "谭": 3799, "谯": 3800, "谱": 3801, "谲": 3802, "谴": 3803, "谶": 3804, "谷": 3805, "豁": 3806, "豆": 3807, "豇": 3808, "豌": 3809, "豚": 3810, "象": 3811, "豪": 3812, "豫": 3813, "豹": 3814, "貂": 3815, "貊": 3816, "貌": 3817, "貘": 3818, "贝": 3819, "贞": 3820, "负": 3821, "贡": 3822, "财": 3823, "责": 3824, "贤": 3825, "败": 3826, "账": 3827, "货": 3828, "质": 3829, "贩": 3830, "贪": 3831, "贫": 3832, "贬": 3833, "购": 3834, "贮": 3835, "贯": 3836, "贰": 3837, "贴": 3838, "贵": 3839, "贷": 3840, "贸": 3841, "费": 3842, "贺": 3843, "贻": 3844, "贼": 3845, "贾": 3846, "贿": 3847, "赁": 3848, "赂": 3849, "赃": 3850, "资": 3851, "赈": 3852, "赉": 3853, "赋": 3854, "赌": 3855, "赎": 3856, "赏": 3857, "赐": 3858, "赓": 3859, "赔": 3860, "赖": 3861, "赘": 3862, "赚": 3863, "赛": 3864, "赝": 3865, "赞": 3866, "赟": 3867, "赠": 3868, "赢": 3869, "赣": 3870, "赤": 3871, "赦": 3872, "赫": 3873, "赭": 3874, "走": 3875, "赴": 3876, "赵": 3877, "赶": 3878, "起": 3879, "趁": 3880, "超": 3881, "越": 3882, "趋": 3883, "趟": 3884, "趣": 3885, "足": 3886, "趴": 3887, "趺": 3888, "趾": 3889, "跃": 3890, "跆": 3891, "跋": 3892, "跌": 3893, "跑": 3894, "跖": 3895, "跗": 3896, "跚": 3897, "距": 3898, "跟": 3899, "跨": 3900, "跪": 3901, "路": 3902, "跳": 3903, "践": 3904, "跻": 3905, "踊": 3906, "踏": 3907, "踢": 3908, "踩": 3909, "踪": 3910, "蹄": 3911, "蹈": 3912, "蹒": 3913, "蹴": 3914, "蹶": 3915, "身": 3916, "躯": 3917, "躲": 3918, "车": 3919, "轧": 3920, "轨": 3921, "轩": 3922, "转": 3923, "轭": 3924, "轮": 3925, "软": 3926, "轰": 3927, "轲": 3928, "轴": 3929, "轶": 3930, "轸": 3931, "轻": 3932, "载": 3933, "轿": 3934, "较": 3935, "辅": 3936, "辆": 3937, "辇": 3938, "辈": 3939, "辉": 3940, "辍": 3941, "辐": 3942, "辑": 3943, "输": 3944, "辕": 3945, "辖": 3946, "辗": 3947, "辙": 3948, "辛": 3949, "辜": 3950, "辞": 3951, "辟": 3952, "辣": 3953, "辨": 3954, "辩": 3955, "辰": 3956, "辱": 3957, "边": 3958, "辻": 3959, "込": 3960, "辽": 3961, "达": 3962, "迁": 3963, "迄": 3964, "迅": 3965, "过": 3966, "迈": 3967, "迎": 3968, "运": 3969, "近": 3970, "返": 3971, "还": 3972, "这": 3973, "进": 3974, "远": 3975, "违": 3976, "连": 
3977, "迟": 3978, "迥": 3979, "迦": 3980, "迪": 3981, "迫": 3982, "迭": 3983, "述": 3984, "迷": 3985, "迹": 3986, "追": 3987, "退": 3988, "送": 3989, "适": 3990, "逃": 3991, "逅": 3992, "逆": 3993, "选": 3994, "逊": 3995, "逍": 3996, "透": 3997, "逐": 3998, "递": 3999, "途": 4000, "逗": 4001, "通": 4002, "逛": 4003, "逝": 4004, "速": 4005, "造": 4006, "逡": 4007, "逢": 4008, "逮": 4009, "逵": 4010, "逸": 4011, "逻": 4012, "逼": 4013, "逾": 4014, "遁": 4015, "遂": 4016, "遇": 4017, "遍": 4018, "遏": 4019, "遐": 4020, "遑": 4021, "道": 4022, "遗": 4023, "遣": 4024, "遥": 4025, "遭": 4026, "遮": 4027, "遴": 4028, "遵": 4029, "避": 4030, "邀": 4031, "邂": 4032, "邃": 4033, "邈": 4034, "邑": 4035, "邓": 4036, "邕": 4037, "邢": 4038, "那": 4039, "邦": 4040, "邨": 4041, "邪": 4042, "邬": 4043, "邮": 4044, "邯": 4045, "邰": 4046, "邱": 4047, "邳": 4048, "邵": 4049, "邸": 4050, "邹": 4051, "邺": 4052, "邻": 4053, "郁": 4054, "郃": 4055, "郈": 4056, "郊": 4057, "郎": 4058, "郏": 4059, "郑": 4060, "郓": 4061, "郝": 4062, "郡": 4063, "郤": 4064, "郦": 4065, "部": 4066, "郭": 4067, "郯": 4068, "郴": 4069, "郸": 4070, "都": 4071, "郾": 4072, "郿": 4073, "鄂": 4074, "鄞": 4075, "鄢": 4076, "鄱": 4077, "酃": 4078, "酉": 4079, "酋": 4080, "酌": 4081, "配": 4082, "酐": 4083, "酒": 4084, "酗": 4085, "酚": 4086, "酢": 4087, "酤": 4088, "酥": 4089, "酪": 4090, "酬": 4091, "酮": 4092, "酯": 4093, "酰": 4094, "酱": 4095, "酵": 4096, "酶": 4097, "酷": 4098, "酸": 4099, "酿": 4100, "醇": 4101, "醉": 4102, "醋": 4103, "醒": 4104, "醚": 4105, "醛": 4106, "醮": 4107, "醯": 4108, "采": 4109, "釉": 4110, "释": 4111, "里": 4112, "重": 4113, "野": 4114, "量": 4115, "金": 4116, "釜": 4117, "鉴": 4118, "銮": 4119, "鋆": 4120, "鋐": 4121, "鍊": 4122, "鎏": 4123, "鏊": 4124, "鑫": 4125, "针": 4126, "钉": 4127, "钊": 4128, "钌": 4129, "钍": 4130, "钓": 4131, "钕": 4132, "钗": 4133, "钙": 4134, "钛": 4135, "钜": 4136, "钝": 4137, "钞": 4138, "钟": 4139, "钠": 4140, "钡": 4141, "钢": 4142, "钥": 4143, "钦": 4144, "钧": 4145, "钨": 4146, "钩": 4147, "钪": 4148, "钫": 4149, "钬": 4150, "钮": 4151, "钯": 4152, "钰": 4153, "钱": 4154, "钲": 4155, "钴": 4156, "钵": 4157, "钹": 4158, "钻": 4159, "钼": 4160, "钾": 4161, "钿": 4162, "铀": 4163, "铁": 4164, "铂": 4165, "铃": 4166, "铅": 4167, "铆": 4168, "铉": 4169, "铊": 4170, "铋": 4171, "铎": 4172, "铑": 4173, "铜": 4174, "铝": 4175, "铟": 4176, "铠": 4177, "铣": 4178, "铨": 4179, "铪": 4180, "铬": 4181, "铭": 4182, "铮": 4183, "铯": 4184, "铰": 4185, "铱": 4186, "铲": 4187, "铳": 4188, "铵": 4189, "银": 4190, "铸": 4191, "铺": 4192, "铼": 4193, "铽": 4194, "链": 4195, "铿": 4196, "销": 4197, "锁": 4198, "锂": 4199, "锅": 4200, "锆": 4201, "锈": 4202, "锉": 4203, "锋": 4204, "锌": 4205, "锎": 4206, "锐": 4207, "锑": 4208, "锗": 4209, "错": 4210, "锚": 4211, "锡": 4212, "锣": 4213, "锤": 4214, "锥": 4215, "锦": 4216, "锫": 4217, "键": 4218, "锯": 4219, "锰": 4220, "锺": 4221, "锻": 4222, "镀": 4223, "镁": 4224, "镂": 4225, "镇": 4226, "镉": 4227, "镊": 4228, "镍": 4229, "镎": 4230, "镐": 4231, "镒": 4232, "镓": 4233, "镖": 4234, "镗": 4235, "镛": 4236, "镜": 4237, "镠": 4238, "镤": 4239, "镧": 4240, "镰": 4241, "镳": 4242, "镶": 4243, "长": 4244, "閒": 4245, "闍": 4246, "门": 4247, "闪": 4248, "闫": 4249, "闭": 4250, "问": 4251, "闯": 4252, "闰": 4253, "闱": 4254, "闲": 4255, "闳": 4256, "间": 4257, "闵": 4258, "闸": 4259, "闹": 4260, "闻": 4261, "闼": 4262, "闽": 4263, "闾": 4264, "阀": 4265, "阁": 4266, "阅": 4267, "阇": 4268, "阈": 4269, "阉": 4270, "阎": 4271, "阏": 4272, "阐": 4273, "阑": 4274, "阔": 4275, "阕": 4276, "阖": 4277, "阙": 4278, "阜": 4279, "队": 4280, "阪": 4281, "阮": 4282, "阱": 4283, "防": 4284, "阳": 4285, "阴": 4286, "阵": 4287, "阶": 4288, "阻": 4289, "阿": 4290, "陀": 4291, "陁": 4292, "陂": 4293, "附": 4294, "际": 4295, "陆": 4296, "陇": 4297, "陈": 4298, "陋": 4299, "陌": 
4300, "降": 4301, "限": 4302, "陕": 4303, "陛": 4304, "陟": 4305, "陡": 4306, "院": 4307, "除": 4308, "陨": 4309, "险": 4310, "陪": 4311, "陲": 4312, "陵": 4313, "陶": 4314, "陷": 4315, "隅": 4316, "隆": 4317, "隈": 4318, "隋": 4319, "隍": 4320, "随": 4321, "隐": 4322, "隔": 4323, "隗": 4324, "隘": 4325, "隙": 4326, "障": 4327, "隧": 4328, "隶": 4329, "隼": 4330, "难": 4331, "雀": 4332, "雁": 4333, "雄": 4334, "雅": 4335, "集": 4336, "雇": 4337, "雉": 4338, "雌": 4339, "雍": 4340, "雏": 4341, "雑": 4342, "雒": 4343, "雕": 4344, "雨": 4345, "雪": 4346, "雯": 4347, "雳": 4348, "零": 4349, "雷": 4350, "雹": 4351, "雾": 4352, "需": 4353, "霄": 4354, "霆": 4355, "震": 4356, "霈": 4357, "霉": 4358, "霍": 4359, "霑": 4360, "霓": 4361, "霖": 4362, "霜": 4363, "霞": 4364, "霰": 4365, "露": 4366, "霸": 4367, "霹": 4368, "青": 4369, "靓": 4370, "靖": 4371, "静": 4372, "靛": 4373, "非": 4374, "靠": 4375, "靡": 4376, "面": 4377, "革": 4378, "靳": 4379, "靴": 4380, "靶": 4381, "靺": 4382, "靼": 4383, "鞅": 4384, "鞋": 4385, "鞍": 4386, "鞑": 4387, "鞘": 4388, "鞣": 4389, "鞨": 4390, "鞭": 4391, "韦": 4392, "韧": 4393, "韩": 4394, "韫": 4395, "韬": 4396, "韭": 4397, "音": 4398, "韵": 4399, "韶": 4400, "頴": 4401, "页": 4402, "顶": 4403, "顷": 4404, "项": 4405, "顺": 4406, "须": 4407, "顽": 4408, "顾": 4409, "顿": 4410, "颁": 4411, "颂": 4412, "预": 4413, "颅": 4414, "领": 4415, "颇": 4416, "颈": 4417, "��": 4418, "颊": 4419, "颌": 4420, "颍": 4421, "颐": 4422, "频": 4423, "颖": 4424, "颗": 4425, "题": 4426, "颚": 4427, "颜": 4428, "额": 4429, "颞": 4430, "颠": 4431, "颤": 4432, "风": 4433, "飒": 4434, "飓": 4435, "飘": 4436, "飙": 4437, "飞": 4438, "食": 4439, "餐": 4440, "餵": 4441, "饥": 4442, "饪": 4443, "饬": 4444, "饭": 4445, "饮": 4446, "饯": 4447, "饰": 4448, "饱": 4449, "饲": 4450, "饴": 4451, "饵": 4452, "饶": 4453, "饷": 4454, "饼": 4455, "饿": 4456, "馀": 4457, "馅": 4458, "馆": 4459, "馈": 4460, "馔": 4461, "首": 4462, "香": 4463, "馥": 4464, "馨": 4465, "馯": 4466, "马": 4467, "驭": 4468, "驮": 4469, "驯": 4470, "驰": 4471, "驱": 4472, "驳": 4473, "驴": 4474, "驶": 4475, "驷": 4476, "驸": 4477, "驹": 4478, "驻": 4479, "驼": 4480, "驾": 4481, "驿": 4482, "骁": 4483, "骂": 4484, "骄": 4485, "骆": 4486, "骈": 4487, "验": 4488, "骏": 4489, "骑": 4490, "骗": 4491, "骘": 4492, "骚": 4493, "骠": 4494, "骤": 4495, "骥": 4496, "骨": 4497, "骰": 4498, "骷": 4499, "骸": 4500, "骼": 4501, "髅": 4502, "髎": 4503, "髓": 4504, "高": 4505, "髻": 4506, "鬃": 4507, "鬣": 4508, "鬼": 4509, "魁": 4510, "魂": 4511, "魄": 4512, "魅": 4513, "魏": 4514, "魔": 4515, "鮎": 4516, "鱼": 4517, "鱿": 4518, "鲀": 4519, "鲁": 4520, "鲃": 4521, "鲇": 4522, "鲈": 4523, "鲍": 4524, "鲎": 4525, "鲑": 4526, "鲛": 4527, "鲜": 4528, "鲡": 4529, "鲢": 4530, "鲣": 4531, "鲤": 4532, "鲨": 4533, "鲫": 4534, "鲲": 4535, "鲳": 4536, "鲴": 4537, "鲷": 4538, "鲸": 4539, "鲹": 4540, "鲻": 4541, "鲼": 4542, "鲾": 4543, "鳃": 4544, "鳄": 4545, "鳅": 4546, "鳌": 4547, "鳍": 4548, "鳐": 4549, "鳔": 4550, "鳕": 4551, "鳖": 4552, "鳗": 4553, "鳞": 4554, "鳟": 4555, "鳢": 4556, "鸟": 4557, "鸠": 4558, "鸡": 4559, "鸢": 4560, "鸣": 4561, "鸥": 4562, "鸦": 4563, "鸩": 4564, "鸭": 4565, "鸮": 4566, "鸯": 4567, "鸱": 4568, "鸲": 4569, "鸵": 4570, "鸻": 4571, "鸽": 4572, "鸾": 4573, "鸿": 4574, "鹀": 4575, "鹃": 4576, "鹅": 4577, "鹉": 4578, "鹊": 4579, "鹋": 4580, "鹏": 4581, "鹑": 4582, "鹘": 4583, "鹤": 4584, "鹦": 4585, "鹩": 4586, "鹪": 4587, "鹫": 4588, "鹬": 4589, "鹭": 4590, "鹰": 4591, "鹳": 4592, "鹿": 4593, "麂": 4594, "麃": 4595, "麋": 4596, "麒": 4597, "麓": 4598, "麝": 4599, "麟": 4600, "麦": 4601, "麻": 4602, "麾": 4603, "黄": 4604, "黍": 4605, "黎": 4606, "黏": 4607, "黑": 4608, "黔": 4609, "默": 4610, "黛": 4611, "黜": 4612, "黧": 4613, "黯": 4614, "黻": 4615, "黼": 4616, "黾": 4617, "鼎": 4618, "鼐": 4619, "鼓": 4620, "鼠": 4621, "鼢": 4622, "鼩": 
4623, "鼬": 4624, "鼱": 4625, "鼷": 4626, "鼹": 4627, "鼻": 4628, "齐": 4629, "齿": 4630, "龄": 4631, "龈": 4632, "龙": 4633, "龚": 4634, "龛": 4635, "龟": 4636, "龢": 4637, "!": 4638, "(": 4639, ")": 4640, ",": 4641, "-": 4642, "/": 4643, ":": 4644, ";": 4645, "?": 4646, "p": 4647, "|": 0, "[UNK]": 4648, "[PAD]": 4649}