Update
- README.md +80 -3
- all_results.json +17 -0
- config.json +37 -0
- eval_results.json +11 -0
- generation_config.json +10 -0
- model.safetensors +3 -0
- special_tokens_map.json +8 -0
- tokenizer.json +241 -0
- tokenizer_config.json +53 -0
- train_results.json +9 -0
- trainer_state.json +3055 -0
- training_args.bin +3 -0
README.md
CHANGED
@@ -1,3 +1,80 @@
---
datasets:
- aehrm/dtaec-lexica
language: de
---

# DTAEC Type Normalizer

This model is trained from scratch to normalize the historic spelling of German to the contemporary one. It is type-based, which means that it takes only a single token (without whitespace) as input and generates the normalized variant.
It achieves the following results on the evaluation set:
- Loss: 0.0308
- Wordacc: 0.9546
- Wordacc Oov: 0.9096

Note: This model is part of a larger system, which uses an additional GPT-based model to disambiguate between different normalization forms by taking the full context into account.

## Training and evaluation data

The model has been trained on the DTA-EC Parallel Corpus Lexicon ([aehrm/dtaec-lexica](https://huggingface.co/datasets/aehrm/dtaec-lexica)), which is derived from a [parallel corpus](https://kaskade.dwds.de/~moocow/software/dtaec/) of the Deutsches Textarchiv (German Text Archive), which aligns historic prints of documents with their modern editions in contemporary orthography.

Training was done on the type level: given the historic form of a type, the model must predict the corresponding normalized type *that appeared most frequently in the parallel corpus*.
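To make that objective concrete, here is a minimal sketch of the most-frequent-normalization selection; the observation pairs are invented for illustration and do not come from the lexicon itself:

```python
from collections import Counter, defaultdict

# Invented (historic, normalized) pairs standing in for corpus observations.
observations = [
    ('seyn', 'sein'), ('seyn', 'sein'), ('seyn', 'seien'),
    ('Freyheit', 'Freiheit'),
]

by_type = defaultdict(Counter)
for historic, normalized in observations:
    by_type[historic][normalized] += 1

# Training target per historic type: its most frequent normalization.
targets = {h: c.most_common(1)[0][0] for h, c in by_type.items()}
print(targets)  # {'seyn': 'sein', 'Freyheit': 'Freiheit'}
```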

## Demo Usage

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained('aehrm/dtaec-type-normalizer')
model = AutoModelForSeq2SeqLM.from_pretrained('aehrm/dtaec-type-normalizer')

# Encode a batch of historic types; the model normalizes one type at a time.
model_in = tokenizer(['Freyheit', 'seyn', 'selbstthätig'], return_tensors='pt', padding=True)
model_out = model.generate(**model_in)

# Decode to the normalized forms, e.g. ['Freiheit', 'sein', 'selbsttätig'].
print(tokenizer.batch_decode(model_out, skip_special_tokens=True))
```


## Training hyperparameters

The following hyperparameters were used during training (a hedged sketch mapping them onto `Seq2SeqTrainingArguments` follows the list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 64
- seed: 12345
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
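The actual training script is not part of this commit, so the following is a sketch only; `output_dir` and `predict_with_generate` are assumptions (the former taken from the checkpoint path in trainer_state.json, the latter because the eval logs report a generation length):

```python
from transformers import Seq2SeqTrainingArguments

# Sketch: mirrors the listed hyperparameters; the Adam betas and epsilon
# above are the optimizer defaults, so they need no explicit arguments.
args = Seq2SeqTrainingArguments(
    output_dir='/volume/output/run2',  # assumed from trainer_state.json
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=64,
    seed=12345,
    lr_scheduler_type='linear',
    num_train_epochs=20,
    predict_with_generate=True,        # assumed, since eval logs report gen_len
)
```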

## Training results

| Training Loss | Epoch | Step   | Validation Loss | Wordacc | Wordacc Oov | Gen Len |
|:-------------:|:-----:|:------:|:---------------:|:-------:|:-----------:|:-------:|
| 0.0912        | 1.0   | 12628  | 0.0698          | 0.8984  | 0.8421      | 12.3456 |
| 0.0746        | 2.0   | 25256  | 0.0570          | 0.9124  | 0.8584      | 12.3442 |
| 0.0622        | 3.0   | 37884  | 0.0493          | 0.9195  | 0.8717      | 12.3512 |
| 0.0584        | 4.0   | 50512  | 0.0465          | 0.9221  | 0.8749      | 12.3440 |
| 0.0497        | 5.0   | 63140  | 0.0436          | 0.9274  | 0.8821      | 12.3552 |
| 0.0502        | 6.0   | 75768  | 0.0411          | 0.9311  | 0.8858      | 12.3519 |
| 0.0428        | 7.0   | 88396  | 0.0396          | 0.9336  | 0.8878      | 12.3444 |
| 0.0416        | 8.0   | 101024 | 0.0372          | 0.9339  | 0.8887      | 12.3471 |
| 0.042         | 9.0   | 113652 | 0.0365          | 0.9396  | 0.8944      | 12.3485 |
| 0.0376        | 10.0  | 126280 | 0.0353          | 0.9412  | 0.8962      | 12.3485 |
| 0.031         | 11.0  | 138908 | 0.0339          | 0.9439  | 0.9008      | 12.3519 |
| 0.0298        | 12.0  | 151536 | 0.0337          | 0.9454  | 0.9013      | 12.3479 |
| 0.0302        | 13.0  | 164164 | 0.0322          | 0.9470  | 0.9043      | 12.3483 |
| 0.0277        | 14.0  | 176792 | 0.0316          | 0.9479  | 0.9040      | 12.3506 |
| 0.0277        | 15.0  | 189420 | 0.0323          | 0.9488  | 0.9030      | 12.3514 |
| 0.0245        | 16.0  | 202048 | 0.0314          | 0.9513  | 0.9072      | 12.3501 |
| 0.0235        | 17.0  | 214676 | 0.0313          | 0.9520  | 0.9071      | 12.3511 |
| 0.0206        | 18.0  | 227304 | 0.0310          | 0.9531  | 0.9084      | 12.3502 |
| 0.0178        | 19.0  | 239932 | 0.0307          | 0.9545  | 0.9094      | 12.3507 |
| 0.016         | 20.0  | 252560 | 0.0308          | 0.9546  | 0.9096      | 12.3516 |


### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
all_results.json
ADDED
@@ -0,0 +1,17 @@
{
    "epoch": 20.0,
    "eval_gen_len": 12.35156514,
    "eval_loss": 0.030767865478992462,
    "eval_runtime": 534.9614,
    "eval_samples": 53222,
    "eval_samples_per_second": 99.488,
    "eval_steps_per_second": 1.555,
    "eval_wordacc": 0.95458645,
    "eval_wordacc_oov": 0.90963293,
    "total_flos": 1486519029399552.0,
    "train_loss": 0.04684994914012881,
    "train_runtime": 19459.2196,
    "train_samples": 101024,
    "train_samples_per_second": 103.832,
    "train_steps_per_second": 12.979
}
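The throughput figures above are self-consistent; a quick check (batch size 64 comes from the README's eval settings, and the step count assumes the last partial batch counts as a step):

```python
import math

# Values from all_results.json.
eval_samples, eval_runtime, eval_batch = 53222, 534.9614, 64
print(round(eval_samples / eval_runtime, 3))  # 99.488, matches eval_samples_per_second
steps = math.ceil(eval_samples / eval_batch)  # 832 evaluation batches
print(round(steps / eval_runtime, 3))         # 1.555, matches eval_steps_per_second
```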
config.json
ADDED
@@ -0,0 +1,37 @@
{
    "activation_dropout": 0.0,
    "activation_function": "gelu",
    "architectures": [
        "BartForConditionalGeneration"
    ],
    "attention_dropout": 0.0,
    "bos_token_id": 1,
    "classifier_dropout": 0.0,
    "d_model": 256,
    "decoder_attention_heads": 4,
    "decoder_ffn_dim": 1024,
    "decoder_layerdrop": 0,
    "decoder_layers": 4,
    "decoder_start_token_id": 2,
    "dropout": 0.3,
    "encoder_attention_heads": 4,
    "encoder_ffn_dim": 1024,
    "encoder_layerdrop": 0,
    "encoder_layers": 4,
    "eos_token_id": 2,
    "forced_eos_token_id": 2,
    "init_std": 0.02,
    "is_encoder_decoder": true,
    "max_length": 100,
    "max_position_embeddings": 1024,
    "model_type": "bart",
    "num_beams": 4,
    "num_hidden_layers": 4,
    "pad_token_id": 0,
    "scale_embedding": false,
    "torch_dtype": "float32",
    "transformers_version": "4.41.2",
    "unk_token_id": 3,
    "use_cache": true,
    "vocab_size": 122
}
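For orientation, config.json describes a very small character-level BART. A sketch rebuilding the architecture from scratch; the printed parameter count is an estimate, but it is consistent with the ~31.7 MB float32 model.safetensors below:

```python
from transformers import BartConfig, BartForConditionalGeneration

# Small BART per config.json: 4+4 layers, d_model 256, 4 heads, FFN 1024, vocab 122.
config = BartConfig(
    vocab_size=122, d_model=256,
    encoder_layers=4, decoder_layers=4,
    encoder_attention_heads=4, decoder_attention_heads=4,
    encoder_ffn_dim=1024, decoder_ffn_dim=1024,
    dropout=0.3, max_position_embeddings=1024,
    pad_token_id=0, bos_token_id=1, eos_token_id=2,
    decoder_start_token_id=2, forced_eos_token_id=2,
)
model = BartForConditionalGeneration(config)
print(sum(p.numel() for p in model.parameters()))  # roughly 7.9M (~31741808 bytes / 4 in float32)
```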
eval_results.json
ADDED
@@ -0,0 +1,11 @@
{
    "epoch": 20.0,
    "eval_gen_len": 12.35156514,
    "eval_loss": 0.030767865478992462,
    "eval_runtime": 534.9614,
    "eval_samples": 53222,
    "eval_samples_per_second": 99.488,
    "eval_steps_per_second": 1.555,
    "eval_wordacc": 0.95458645,
    "eval_wordacc_oov": 0.90963293
}
generation_config.json
ADDED
@@ -0,0 +1,10 @@
{
    "bos_token_id": 1,
    "decoder_start_token_id": 2,
    "eos_token_id": 2,
    "forced_eos_token_id": 2,
    "max_length": 100,
    "num_beams": 4,
    "pad_token_id": 0,
    "transformers_version": "4.41.2"
}
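These defaults mean every `generate()` call decodes with beam search (4 beams, at most 100 tokens). A sketch of the equivalent `GenerationConfig`, with a hypothetical call-time override:

```python
from transformers import GenerationConfig

# Mirrors generation_config.json: beam search with 4 beams, up to 100 tokens.
gen_config = GenerationConfig(
    bos_token_id=1, decoder_start_token_id=2, eos_token_id=2,
    forced_eos_token_id=2, max_length=100, num_beams=4, pad_token_id=0,
)
# Individual settings can be overridden per call, e.g. greedy decoding:
# model.generate(**model_in, generation_config=gen_config, num_beams=1)
```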
model.safetensors
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:002146d5baf5a9676c0748993d7ec275f0e991ca6bd4664ff8c7644d902f77fd
size 31741808
special_tokens_map.json
ADDED
@@ -0,0 +1,8 @@
{
    "bos_token": "<s>",
    "cls_token": "<s>",
    "eos_token": "</s>",
    "pad_token": "<pad>",
    "sep_token": "</s>",
    "unk_token": "<unk>"
}
tokenizer.json
ADDED
@@ -0,0 +1,241 @@
{
    "version": "1.0",
    "truncation": {
        "direction": "Right",
        "max_length": 100,
        "strategy": "LongestFirst",
        "stride": 0
    },
    "padding": null,
    "added_tokens": [
        {"id": 0, "content": "<pad>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": false, "special": true},
        {"id": 1, "content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": false, "special": true},
        {"id": 2, "content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": false, "special": true},
        {"id": 3, "content": "<unk>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": false, "special": true}
    ],
    "normalizer": {
        "type": "Sequence",
        "normalizers": [
            {"type": "Replace", "pattern": {"String": " "}, "content": ""},
            {"type": "Replace", "pattern": {"String": "ſ"}, "content": "s"},
            {"type": "Replace", "pattern": {"String": "aͤ"}, "content": "ä"},
            {"type": "Replace", "pattern": {"String": "oͤ"}, "content": "ö"},
            {"type": "Replace", "pattern": {"String": "uͤ"}, "content": "ü"}
        ]
    },
    "pre_tokenizer": null,
    "post_processor": {
        "type": "RobertaProcessing",
        "sep": ["</s>", 2],
        "cls": ["<s>", 1],
        "trim_offsets": true,
        "add_prefix_space": true
    },
    "decoder": {
        "type": "BPEDecoder",
        "suffix": "</w>"
    },
    "model": {
        "type": "BPE",
        "dropout": null,
        "unk_token": "<unk>",
        "continuing_subword_prefix": null,
        "end_of_word_suffix": null,
        "fuse_unk": false,
        "byte_fallback": false,
        "ignore_merges": false,
        "vocab": {
            "<pad>": 0, "<s>": 1, "</s>": 2, "<unk>": 3,
            "!": 4, "&": 5, "'": 6, "(": 7, ")": 8, "*": 9, ",": 10, "-": 11, ".": 12,
            "0": 13, "1": 14, "2": 15, "3": 16, "4": 17, "5": 18, "6": 19, "7": 20, "8": 21, "9": 22,
            ":": 23, ";": 24, "=": 25, ">": 26, "?": 27,
            "A": 28, "B": 29, "C": 30, "D": 31, "E": 32, "F": 33, "G": 34, "H": 35, "I": 36, "J": 37,
            "K": 38, "L": 39, "M": 40, "N": 41, "O": 42, "P": 43, "Q": 44, "R": 45, "S": 46, "T": 47,
            "U": 48, "V": 49, "W": 50, "X": 51, "Y": 52, "Z": 53, "[": 54,
            "a": 55, "b": 56, "c": 57, "d": 58, "e": 59, "f": 60, "g": 61, "h": 62, "i": 63, "j": 64,
            "k": 65, "l": 66, "m": 67, "n": 68, "o": 69, "p": 70, "q": 71, "r": 72, "s": 73, "t": 74,
            "u": 75, "v": 76, "w": 77, "x": 78, "y": 79, "z": 80,
            "«": 81, "°": 82, "»": 83, "½": 84, "Ä": 85, "Ç": 86, "É": 87, "Ö": 88, "Ü": 89, "ß": 90,
            "à": 91, "â": 92, "ä": 93, "ç": 94, "è": 95, "é": 96, "ê": 97, "ë": 98, "î": 99, "ñ": 100,
            "ô": 101, "ö": 102, "û": 103, "ü": 104, "ē": 105, "ĕ": 106,
            "‒": 107, "–": 108, "—": 109, "‘": 110, "’": 111, "‚": 112, "‛": 113, "“": 114, "”": 115,
            "„": 116, "″": 117, "⁊": 118, "▁": 119, "░": 120, "ꝛ": 121
        },
        "merges": []
    }
}
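The `normalizer` block is the linguistically interesting part: before character-level tokenization, spaces are stripped and a few historic glyphs (the long s and the superscript-e umlauts) are mapped to their modern forms. A minimal sketch reproducing it with the `tokenizers` library:

```python
from tokenizers import normalizers

# Same pipeline as in tokenizer.json: drop spaces, then replace the long s (ſ)
# and the superscript-e umlaut combinations with their modern equivalents.
norm = normalizers.Sequence([
    normalizers.Replace(" ", ""),
    normalizers.Replace("ſ", "s"),
    normalizers.Replace("aͤ", "ä"),
    normalizers.Replace("oͤ", "ö"),
    normalizers.Replace("uͤ", "ü"),
])
print(norm.normalize_str("ſelbſtthaͤtig"))  # 'selbstthätig'
```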
tokenizer_config.json
ADDED
@@ -0,0 +1,53 @@
{
    "added_tokens_decoder": {
        "0": {"content": "<pad>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true},
        "1": {"content": "<s>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true},
        "2": {"content": "</s>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true},
        "3": {"content": "<unk>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true}
    },
    "bos_token": "<s>",
    "clean_up_tokenization_spaces": true,
    "cls_token": "<s>",
    "eos_token": "</s>",
    "max_length": null,
    "model_input_names": ["input_ids", "attention_mask"],
    "model_max_length": 100,
    "pad_to_multiple_of": null,
    "pad_token": "<pad>",
    "pad_token_type_id": 0,
    "padding_side": "right",
    "sep_token": "</s>",
    "tokenizer_class": "PreTrainedTokenizerFast",
    "unk_token": "<unk>"
}
train_results.json
ADDED
@@ -0,0 +1,9 @@
{
    "epoch": 20.0,
    "total_flos": 1486519029399552.0,
    "train_loss": 0.04684994914012881,
    "train_runtime": 19459.2196,
    "train_samples": 101024,
    "train_samples_per_second": 103.832,
    "train_steps_per_second": 12.979
}
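The step counts are internally consistent with the README's hyperparameters; a quick check:

```python
# 101024 training samples at batch size 8 give exactly 12628 optimizer steps per
# epoch; over 20 epochs that is 252560 steps, matching global_step below.
train_samples, batch_size, epochs = 101024, 8, 20
steps_per_epoch = train_samples // batch_size
print(steps_per_epoch, steps_per_epoch * epochs)  # 12628 252560
```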
trainer_state.json
ADDED
@@ -0,0 +1,3055 @@
{
    "best_metric": 0.90963293,
    "best_model_checkpoint": "/volume/output/run2/checkpoint-252560",
    "epoch": 20.0,
    "eval_steps": 500,
    "global_step": 252560,
    "is_hyper_param_search": false,
    "is_local_process_zero": true,
    "is_world_process_zero": true,
    "log_history": [
        {"epoch": 0.05004751346214761, "grad_norm": 1.831141710281372, "learning_rate": 9.974976243268926e-05, "loss": 1.5344, "step": 632},
        {"epoch": 0.10009502692429521, "grad_norm": 3.449512004852295, "learning_rate": 9.949952486537853e-05, "loss": 0.3225, "step": 1264},
        {"epoch": 0.1501425403864428, "grad_norm": 1.8513766527175903, "learning_rate": 9.92492872980678e-05, "loss": 0.2466, "step": 1896},
        {"epoch": 0.20019005384859043, "grad_norm": 1.0473196506500244, "learning_rate": 9.899904973075705e-05, "loss": 0.2176, "step": 2528},
        {"epoch": 0.25023756731073804, "grad_norm": 1.3224372863769531, "learning_rate": 9.874881216344632e-05, "loss": 0.1852, "step": 3160},
        {"epoch": 0.3002850807728856, "grad_norm": 1.1163212060928345, "learning_rate": 9.849857459613557e-05, "loss": 0.1725, "step": 3792},
        {"epoch": 0.35033259423503327, "grad_norm": 0.2766059339046478, "learning_rate": 9.824833702882485e-05, "loss": 0.1613, "step": 4424},
        {"epoch": 0.40038010769718085, "grad_norm": 1.0185670852661133, "learning_rate": 9.79980994615141e-05, "loss": 0.1345, "step": 5056},
        {"epoch": 0.4504276211593285, "grad_norm": 1.164562702178955, "learning_rate": 9.774786189420336e-05, "loss": 0.1364, "step": 5688},
        {"epoch": 0.5004751346214761, "grad_norm": 1.1677042245864868, "learning_rate": 9.749762432689263e-05, "loss": 0.1277, "step": 6320},
        {"epoch": 0.5505226480836237, "grad_norm": 3.836665153503418, "learning_rate": 9.724738675958188e-05, "loss": 0.1254, "step": 6952},
        {"epoch": 0.6005701615457713, "grad_norm": 2.3849477767944336, "learning_rate": 9.699714919227115e-05, "loss": 0.1188, "step": 7584},
        {"epoch": 0.6506176750079189, "grad_norm": 3.4638051986694336, "learning_rate": 9.674691162496041e-05, "loss": 0.1159, "step": 8216},
        {"epoch": 0.7006651884700665, "grad_norm": 0.2078377604484558, "learning_rate": 9.649667405764967e-05, "loss": 0.1097, "step": 8848},
        {"epoch": 0.7507127019322142, "grad_norm": 2.513908624649048, "learning_rate": 9.624643649033894e-05, "loss": 0.1073, "step": 9480},
        {"epoch": 0.8007602153943617, "grad_norm": 1.9936473369598389, "learning_rate": 9.599619892302819e-05, "loss": 0.1006, "step": 10112},
        {"epoch": 0.8508077288565093, "grad_norm": 1.9626480340957642, "learning_rate": 9.574596135571746e-05, "loss": 0.1019, "step": 10744},
        {"epoch": 0.900855242318657, "grad_norm": 1.7610087394714355, "learning_rate": 9.549572378840672e-05, "loss": 0.0898, "step": 11376},
        {"epoch": 0.9509027557808045, "grad_norm": 1.6743286848068237, "learning_rate": 9.524548622109598e-05, "loss": 0.0912, "step": 12008},
        {"epoch": 1.0, "eval_gen_len": 12.34557138, "eval_loss": 0.06976257264614105, "eval_runtime": 538.4523, "eval_samples_per_second": 98.843, "eval_steps_per_second": 1.545, "eval_wordacc": 0.8983691, "eval_wordacc_oov": 0.84212982, "step": 12628},
        {"epoch": 1.0009502692429522, "grad_norm": 0.813258707523346, "learning_rate": 9.499524865378525e-05, "loss": 0.0908, "step": 12640},
        {"epoch": 1.0509977827050998, "grad_norm": 1.4410091638565063, "learning_rate": 9.474501108647451e-05, "loss": 0.0864, "step": 13272},
        {"epoch": 1.1010452961672474, "grad_norm": 0.1239413172006607, "learning_rate": 9.449477351916377e-05, "loss": 0.0819, "step": 13904},
        {"epoch": 1.151092809629395, "grad_norm": 0.41668111085891724, "learning_rate": 9.424453595185304e-05, "loss": 0.0872, "step": 14536},
        {"epoch": 1.2011403230915425, "grad_norm": 0.2707739472389221, "learning_rate": 9.399429838454229e-05, "loss": 0.082, "step": 15168},
        {"epoch": 1.2511878365536901, "grad_norm": 0.5811319351196289, "learning_rate": 9.374406081723154e-05, "loss": 0.0847, "step": 15800},
        {"epoch": 1.3012353500158378, "grad_norm": 1.7169886827468872, "learning_rate": 9.349382324992082e-05, "loss": 0.0817, "step": 16432},
        {"epoch": 1.3512828634779854, "grad_norm": 0.11871356517076492, "learning_rate": 9.324358568261008e-05, "loss": 0.0884, "step": 17064},
        {"epoch": 1.401330376940133, "grad_norm": 1.614180326461792, "learning_rate": 9.299334811529935e-05, "loss": 0.0781, "step": 17696},
        {"epoch": 1.4513778904022807, "grad_norm": 0.5827309489250183, "learning_rate": 9.27431105479886e-05, "loss": 0.0722, "step": 18328},
        {"epoch": 1.5014254038644284, "grad_norm": 0.09628592431545258, "learning_rate": 9.249287298067785e-05, "loss": 0.0815, "step": 18960},
        {"epoch": 1.551472917326576, "grad_norm": 1.4028997421264648, "learning_rate": 9.224263541336713e-05, "loss": 0.0758, "step": 19592},
        {"epoch": 1.6015204307887236, "grad_norm": 0.37902089953422546, "learning_rate": 9.199239784605639e-05, "loss": 0.0772, "step": 20224},
        {"epoch": 1.651567944250871, "grad_norm": 0.5193475484848022, "learning_rate": 9.174216027874564e-05, "loss": 0.0731, "step": 20856},
        {"epoch": 1.7016154577130187, "grad_norm": 1.5854244232177734, "learning_rate": 9.149192271143491e-05, "loss": 0.0704, "step": 21488},
        {"epoch": 1.7516629711751663, "grad_norm": 2.4236505031585693, "learning_rate": 9.124168514412418e-05, "loss": 0.0733, "step": 22120},
        {"epoch": 1.8017104846373138, "grad_norm": 0.610543429851532, "learning_rate": 9.099144757681343e-05, "loss": 0.0712, "step": 22752},
        {"epoch": 1.8517579980994614, "grad_norm": 0.8024447560310364, "learning_rate": 9.07412100095027e-05, "loss": 0.0758, "step": 23384},
        {"epoch": 1.901805511561609, "grad_norm": 0.1649412214756012, "learning_rate": 9.049097244219195e-05, "loss": 0.0736, "step": 24016},
        {"epoch": 1.9518530250237567, "grad_norm": 0.15900301933288574, "learning_rate": 9.024073487488122e-05, "loss": 0.0746, "step": 24648},
        {"epoch": 2.0, "eval_gen_len": 12.34418098, "eval_loss": 0.057007092982530594, "eval_runtime": 547.57, "eval_samples_per_second": 97.197, "eval_steps_per_second": 1.519, "eval_wordacc": 0.91236707, "eval_wordacc_oov": 0.85835753, "step": 25256},
        {"epoch": 2.0019005384859043, "grad_norm": 1.2868945598602295, "learning_rate": 8.999049730757049e-05, "loss": 0.0676, "step": 25280},
        {"epoch": 2.051948051948052, "grad_norm": 0.5984334945678711, "learning_rate": 8.974025974025974e-05, "loss": 0.0627, "step": 25912},
        {"epoch": 2.1019955654101996, "grad_norm": 0.043390534818172455, "learning_rate": 8.949002217294901e-05, "loss": 0.0583, "step": 26544},
        {"epoch": 2.1520430788723472, "grad_norm": 0.544601321220398, "learning_rate": 8.923978460563826e-05, "loss": 0.0614, "step": 27176},
        {"epoch": 2.202090592334495, "grad_norm": 0.22555088996887207, "learning_rate": 8.898954703832753e-05, "loss": 0.0713, "step": 27808},
        {"epoch": 2.2521381057966425, "grad_norm": 2.1508424282073975, "learning_rate": 8.87393094710168e-05, "loss": 0.071, "step": 28440},
        {"epoch": 2.30218561925879, "grad_norm": 0.2992984354496002, "learning_rate": 8.848907190370605e-05, "loss": 0.0655, "step": 29072},
        {"epoch": 2.3522331327209374, "grad_norm": 1.4744491577148438, "learning_rate": 8.823883433639532e-05, "loss": 0.0697, "step": 29704},
        {"epoch": 2.402280646183085, "grad_norm": 0.6834865808486938, "learning_rate": 8.798859676908457e-05, "loss": 0.0661, "step": 30336},
        {"epoch": 2.4523281596452327, "grad_norm": 0.06805714964866638, "learning_rate": 8.773835920177384e-05, "loss": 0.0672, "step": 30968},
        {"epoch": 2.5023756731073803, "grad_norm": 1.880346655845642, "learning_rate": 8.748812163446311e-05, "loss": 0.063, "step": 31600},
        {"epoch": 2.552423186569528, "grad_norm": 0.5247331857681274, "learning_rate": 8.723788406715236e-05, "loss": 0.0621, "step": 32232},
        {"epoch": 2.6024707000316756, "grad_norm": 0.15831807255744934, "learning_rate": 8.698764649984163e-05, "loss": 0.0653, "step": 32864},
        {"epoch": 2.652518213493823, "grad_norm": 0.04121825844049454, "learning_rate": 8.673740893253088e-05, "loss": 0.0642, "step": 33496},
        {"epoch": 2.702565726955971, "grad_norm": 0.3549499809741974, "learning_rate": 8.648717136522015e-05, "loss": 0.0679, "step": 34128},
        {"epoch": 2.7526132404181185, "grad_norm": 0.16000640392303467, "learning_rate": 8.623693379790942e-05, "loss": 0.0597, "step": 34760},
        {"epoch": 2.802660753880266, "grad_norm": 0.6360165476799011, "learning_rate": 8.598669623059867e-05, "loss": 0.0666, "step": 35392},
        {"epoch": 2.852708267342414, "grad_norm": 0.4808698296546936, "learning_rate": 8.573645866328793e-05, "loss": 0.0654, "step": 36024},
        {"epoch": 2.9027557808045614, "grad_norm": 1.7070688009262085, "learning_rate": 8.54862210959772e-05, "loss": 0.0645, "step": 36656},
        {"epoch": 2.952803294266709, "grad_norm": 0.3017909824848175, "learning_rate": 8.523598352866646e-05, "loss": 0.0622, "step": 37288},
        {"epoch": 3.0, "eval_gen_len": 12.35115178, "eval_loss": 0.04925922676920891, "eval_runtime": 519.5392, "eval_samples_per_second": 102.441, "eval_steps_per_second": 1.601, "eval_wordacc": 0.9194506, "eval_wordacc_oov": 0.87168187, "step": 37884},
        {"epoch": 3.0028508077288567, "grad_norm": 0.5821070671081543, "learning_rate": 8.498574596135573e-05, "loss": 0.0661, "step": 37920},
        {"epoch": 3.052898321191004, "grad_norm": 0.4475654363632202, "learning_rate": 8.473550839404498e-05, "loss": 0.0574, "step": 38552},
        {"epoch": 3.1029458346531515, "grad_norm": 1.4051363468170166, "learning_rate": 8.448527082673424e-05, "loss": 0.0614, "step": 39184},
        {"epoch": 3.152993348115299, "grad_norm": 0.5270406603813171, "learning_rate": 8.423503325942352e-05, "loss": 0.059, "step": 39816},
        {"epoch": 3.203040861577447, "grad_norm": 0.2516399621963501, "learning_rate": 8.398479569211277e-05, "loss": 0.0568, "step": 40448},
        {"epoch": 3.2530883750395945, "grad_norm": 2.4771833419799805, "learning_rate": 8.373455812480203e-05, "loss": 0.055, "step": 41080},
        {"epoch": 3.303135888501742, "grad_norm": 0.39709779620170593, "learning_rate": 8.348432055749129e-05, "loss": 0.0591, "step": 41712},
        {"epoch": 3.3531834019638898, "grad_norm": 0.2029147893190384, "learning_rate": 8.323408299018055e-05, "loss": 0.0572, "step": 42344},
        {"epoch": 3.4032309154260374, "grad_norm": 0.1706971675157547, "learning_rate": 8.298384542286983e-05, "loss": 0.0618, "step": 42976},
        {"epoch": 3.453278428888185, "grad_norm": 0.14079004526138306, "learning_rate": 8.273360785555908e-05, "loss": 0.0583, "step": 43608},
        {"epoch": 3.5033259423503327, "grad_norm": 0.8446473479270935, "learning_rate": 8.248337028824834e-05, "loss": 0.0582, "step": 44240},
        {"epoch": 3.5533734558124803, "grad_norm": 1.3202866315841675, "learning_rate": 8.22331327209376e-05, "loss": 0.0594, "step": 44872},
        {"epoch": 3.603420969274628, "grad_norm": 0.1289588212966919, "learning_rate": 8.198289515362686e-05, "loss": 0.0668, "step": 45504},
        {"epoch": 3.653468482736775, "grad_norm": 0.7715129852294922, "learning_rate": 8.173265758631612e-05, "loss": 0.056, "step": 46136},
        {"epoch": 3.703515996198923, "grad_norm": 0.8882943391799927, "learning_rate": 8.148242001900539e-05, "loss": 0.0585, "step": 46768},
        {"epoch": 3.7535635096610704, "grad_norm": 0.5238478183746338, "learning_rate": 8.123218245169465e-05, "loss": 0.0523, "step": 47400},
        {"epoch": 3.803611023123218, "grad_norm": 0.7119426727294922, "learning_rate": 8.098194488438391e-05, "loss": 0.0566, "step": 48032},
        {"epoch": 3.8536585365853657, "grad_norm": 1.7843942642211914, "learning_rate": 8.073170731707318e-05, "loss": 0.0577, "step": 48664},
        {"epoch": 3.9037060500475134, "grad_norm": 0.09263037890195847, "learning_rate": 8.048146974976244e-05, "loss": 0.0602, "step": 49296},
        {"epoch": 3.953753563509661, "grad_norm": 0.8220856785774231, "learning_rate": 8.02312321824517e-05, "loss": 0.0584, "step": 49928},
        {"epoch": 4.0, "eval_gen_len": 12.34404945, "eval_loss": 0.04651999473571777, "eval_runtime": 525.6595, "eval_samples_per_second": 101.248, "eval_steps_per_second": 1.583, "eval_wordacc": 0.92211867, "eval_wordacc_oov": 0.87494815, "step": 50512},
        {"epoch": 4.003801076971809, "grad_norm": 0.09420084208250046, "learning_rate": 7.998099461514096e-05, "loss": 0.0559, "step": 50560},
        {"epoch": 4.053848590433956, "grad_norm": 0.6216241121292114, "learning_rate": 7.973075704783022e-05, "loss": 0.0507, "step": 51192},
        {"epoch": 4.103896103896104, "grad_norm": 0.6072413921356201, "learning_rate": 7.948051948051949e-05, "loss": 0.0508, "step": 51824},
        {"epoch": 4.153943617358252, "grad_norm": 0.3261624872684479, "learning_rate": 7.923028191320875e-05, "loss": 0.0584, "step": 52456},
        {"epoch": 4.203991130820399, "grad_norm": 3.996793508529663, "learning_rate": 7.898004434589801e-05, "loss": 0.0542, "step": 53088},
        {"epoch": 4.254038644282547, "grad_norm": 0.09758679568767548, "learning_rate": 7.872980677858727e-05, "loss": 0.0543, "step": 53720},
        {"epoch": 4.3040861577446945, "grad_norm": 0.8341479897499084, "learning_rate": 7.847956921127652e-05, "loss": 0.06, "step": 54352},
        {"epoch": 4.354133671206842, "grad_norm": 0.3256041407585144, "learning_rate": 7.82293316439658e-05, "loss": 0.0547, "step": 54984},
        {"epoch": 4.40418118466899, "grad_norm": 0.8251773715019226, "learning_rate": 7.797909407665506e-05, "loss": 0.0563, "step": 55616},
        {"epoch": 4.454228698131137, "grad_norm": 0.23588858544826508, "learning_rate": 7.772885650934432e-05, "loss": 0.0566, "step": 56248},
        {"epoch": 4.504276211593285, "grad_norm": 0.8622148633003235, "learning_rate": 7.747861894203358e-05, "loss": 0.0511, "step": 56880},
        {"epoch": 4.554323725055433, "grad_norm": 0.29321447014808655, "learning_rate": 7.722838137472284e-05, "loss": 0.0502, "step": 57512},
        {"epoch": 4.60437123851758, "grad_norm": 1.1889938116073608, "learning_rate": 7.697814380741211e-05, "loss": 0.0525, "step": 58144},
        {"epoch": 4.654418751979728, "grad_norm": 0.5421351790428162, "learning_rate": 7.672790624010137e-05, "loss": 0.0509, "step": 58776},
        {"epoch": 4.704466265441875, "grad_norm": 0.4371638000011444, "learning_rate": 7.647766867279062e-05, "loss": 0.0548, "step": 59408},
        {"epoch": 4.754513778904023, "grad_norm": 0.06439998745918274, "learning_rate": 7.622743110547989e-05, "loss": 0.0553, "step": 60040},
        {"epoch": 4.80456129236617, "grad_norm": 0.8337986469268799, "learning_rate": 7.597719353816916e-05, "loss": 0.0539, "step": 60672},
        {"epoch": 4.854608805828318, "grad_norm": 0.5551128387451172, "learning_rate": 7.572695597085841e-05, "loss": 0.0531, "step": 61304},
        {"epoch": 4.904656319290465, "grad_norm": 0.842311680316925, "learning_rate": 7.547671840354768e-05, "loss": 0.053, "step": 61936},
        {"epoch": 4.954703832752613, "grad_norm": 0.6434153318405151, "learning_rate": 7.522648083623693e-05, "loss": 0.0497, "step": 62568},
        {"epoch": 5.0, "eval_gen_len": 12.35519146, "eval_loss": 0.04361514747142792, "eval_runtime": 525.1326, "eval_samples_per_second": 101.35, "eval_steps_per_second": 1.584, "eval_wordacc": 0.92741723, "eval_wordacc_oov": 0.88205102, "step": 63140},
        {"epoch": 5.004751346214761, "grad_norm": 0.11094748228788376, "learning_rate": 7.49762432689262e-05, "loss": 0.0509, "step": 63200},
        {"epoch": 5.054798859676908, "grad_norm": 0.46878868341445923, "learning_rate": 7.472600570161547e-05, "loss": 0.0445, "step": 63832},
        {"epoch": 5.104846373139056, "grad_norm": 1.7899694442749023, "learning_rate": 7.447576813430472e-05, "loss": 0.045, "step": 64464},
        {"epoch": 5.1548938866012035, "grad_norm": 0.03743477538228035, "learning_rate": 7.422553056699399e-05, "loss": 0.0538, "step": 65096},
        {"epoch": 5.204941400063351, "grad_norm": 0.9636221528053284, "learning_rate": 7.397529299968324e-05, "loss": 0.0444, "step": 65728},
        {"epoch": 5.254988913525499, "grad_norm": 0.1024821326136589, "learning_rate": 7.372505543237251e-05, "loss": 0.0543, "step": 66360},
        {"epoch": 5.305036426987646, "grad_norm": 0.6220707297325134, "learning_rate": 7.347481786506178e-05, "loss": 0.0537, "step": 66992},
        {"epoch": 5.355083940449794, "grad_norm": 0.13747639954090118, "learning_rate": 7.322458029775103e-05, "loss": 0.0443, "step": 67624},
        {"epoch": 5.405131453911942, "grad_norm": 0.25481894612312317, "learning_rate": 7.29743427304403e-05, "loss": 0.0544, "step": 68256},
        {"epoch": 5.455178967374089, "grad_norm": 0.40640395879745483, "learning_rate": 7.272410516312955e-05, "loss": 0.0501, "step": 68888},
        {"epoch": 5.505226480836237, "grad_norm": 0.017994888126850128, "learning_rate": 7.247386759581882e-05, "loss": 0.0497, "step": 69520},
        {"epoch": 5.555273994298385, "grad_norm": 0.7399475574493408, "learning_rate": 7.222363002850809e-05, "loss": 0.0503, "step": 70152},
        {"epoch": 5.605321507760532, "grad_norm": 0.1965421885251999, "learning_rate": 7.197339246119734e-05, "loss": 0.0494, "step": 70784},
        {"epoch": 5.65536902122268, "grad_norm": 0.603735625743866, "learning_rate": 7.172315489388661e-05, "loss": 0.0492, "step": 71416},
        {"epoch": 5.705416534684828, "grad_norm": 0.4098168909549713, "learning_rate": 7.147291732657586e-05, "loss": 0.0496, "step": 72048},
        {"epoch": 5.755464048146975, "grad_norm": 0.026117555797100067, "learning_rate": 7.122267975926513e-05, "loss": 0.05, "step": 72680},
        {"epoch": 5.805511561609123, "grad_norm": 0.03871222585439682, "learning_rate": 7.09724421919544e-05, "loss": 0.0508, "step": 73312},
        {"epoch": 5.8555590750712705, "grad_norm": 1.100329041481018, "learning_rate": 7.072220462464365e-05, "loss": 0.0503, "step": 73944},
        {"epoch": 5.905606588533418, "grad_norm": 0.8121901154518127, "learning_rate": 7.04719670573329e-05, "loss": 0.0468, "step": 74576},
        {"epoch": 5.955654101995566, "grad_norm": 0.6457042694091797, "learning_rate": 7.022172949002219e-05, "loss": 0.0502, "step": 75208},
        {"epoch": 6.0, "eval_gen_len": 12.35192214, "eval_loss": 0.041068777441978455, "eval_runtime": 522.1947, "eval_samples_per_second": 101.92, "eval_steps_per_second": 1.593, "eval_wordacc": 0.93108113, "eval_wordacc_oov": 0.88583575, "step": 75768},
        {"epoch": 6.005701615457713, "grad_norm": 0.20237529277801514, "learning_rate": 6.997149192271144e-05, "loss": 0.0496, "step": 75840},
        {"epoch": 6.055749128919861, "grad_norm": 0.10558341443538666, "learning_rate": 6.97212543554007e-05, "loss": 0.0422, "step": 76472},
        {"epoch": 6.105796642382008, "grad_norm": 0.38815170526504517, "learning_rate": 6.947101678808996e-05, "loss": 0.0468, "step": 77104},
        {"epoch": 6.1558441558441555, "grad_norm": 0.042554233223199844, "learning_rate": 6.922077922077921e-05, "loss": 0.0446, "step": 77736},
        {"epoch": 6.205891669306303, "grad_norm": 1.2894216775894165, "learning_rate": 6.89705416534685e-05, "loss": 0.0462, "step": 78368},
        {"epoch": 6.255939182768451, "grad_norm": 0.7259889841079712, "learning_rate": 6.872030408615775e-05, "loss": 0.045, "step": 79000},
        {"epoch": 6.305986696230598, "grad_norm": 0.5286532044410706, "learning_rate": 6.8470066518847e-05, "loss": 0.0446, "step": 79632},
        {"epoch": 6.356034209692746, "grad_norm": 0.20671215653419495, "learning_rate": 6.821982895153627e-05, "loss": 0.0442, "step": 80264},
        {"epoch": 6.406081723154894, "grad_norm": 0.2522045969963074, "learning_rate": 6.796959138422552e-05, "loss": 0.0455, "step": 80896},
        {"epoch": 6.456129236617041, "grad_norm": 0.09235669672489166, "learning_rate": 6.771935381691479e-05, "loss": 0.0463, "step": 81528},
        {"epoch": 6.506176750079189, "grad_norm": 1.3369249105453491, "learning_rate": 6.746911624960406e-05, "loss": 0.0467, "step": 82160},
        {"epoch": 6.556224263541337, "grad_norm": 0.1361207365989685, "learning_rate": 6.721887868229331e-05, "loss": 0.0464, "step": 82792},
        {"epoch": 6.606271777003484, "grad_norm": 0.057785116136074066, "learning_rate": 6.696864111498258e-05, "loss": 0.0471, "step": 83424},
        {"epoch": 6.656319290465632, "grad_norm": 0.5493625998497009, "learning_rate": 6.671840354767185e-05, "loss": 0.0493, "step": 84056},
        {"epoch": 6.7063668039277795, "grad_norm": 0.08289259672164917, "learning_rate": 6.64681659803611e-05, "loss": 0.0491, "step": 84688},
        {"epoch": 6.756414317389927, "grad_norm": 0.08197712898254395, "learning_rate": 6.621792841305037e-05, "loss": 0.0487, "step": 85320},
        {"epoch": 6.806461830852075, "grad_norm": 0.1574297547340393, "learning_rate": 6.596769084573962e-05, "loss": 0.042, "step": 85952},
        {"epoch": 6.856509344314222, "grad_norm": 0.07021531462669373, "learning_rate": 6.571745327842889e-05, "loss": 0.0469, "step": 86584},
        {"epoch": 6.90655685777637, "grad_norm": 0.5342025756835938, "learning_rate": 6.546721571111816e-05, "loss": 0.0458, "step": 87216},
        {"epoch": 6.956604371238518, "grad_norm": 0.754435658454895, "learning_rate": 6.521697814380741e-05, "loss": 0.0428, "step": 87848},
        {"epoch": 7.0, "eval_gen_len": 12.34435008, "eval_loss": 0.039582036435604095, "eval_runtime": 513.2393, "eval_samples_per_second": 103.698, "eval_steps_per_second": 1.621, "eval_wordacc": 0.93356131, "eval_wordacc_oov": 0.88780589, "step": 88396},
        {"epoch": 7.006651884700665, "grad_norm": 1.7553069591522217, "learning_rate": 6.496674057649668e-05, "loss": 0.0429, "step": 88480},
        {"epoch": 7.056699398162813, "grad_norm": 0.15994039177894592, "learning_rate": 6.471650300918593e-05, "loss": 0.0451, "step": 89112},
        {"epoch": 7.106746911624961, "grad_norm": 0.21786805987358093, "learning_rate": 6.44662654418752e-05, "loss": 0.0415, "step": 89744}
|
1081 |
+
},
|
1082 |
+
{
|
1083 |
+
"epoch": 7.156794425087108,
|
1084 |
+
"grad_norm": 0.07405902445316315,
|
1085 |
+
"learning_rate": 6.421602787456447e-05,
|
1086 |
+
"loss": 0.0408,
|
1087 |
+
"step": 90376
|
1088 |
+
},
|
1089 |
+
{
|
1090 |
+
"epoch": 7.206841938549256,
|
1091 |
+
"grad_norm": 0.1853848397731781,
|
1092 |
+
"learning_rate": 6.396579030725372e-05,
|
1093 |
+
"loss": 0.0415,
|
1094 |
+
"step": 91008
|
1095 |
+
},
|
1096 |
+
{
|
1097 |
+
"epoch": 7.256889452011404,
|
1098 |
+
"grad_norm": 0.41366642713546753,
|
1099 |
+
"learning_rate": 6.371555273994299e-05,
|
1100 |
+
"loss": 0.046,
|
1101 |
+
"step": 91640
|
1102 |
+
},
|
1103 |
+
{
|
1104 |
+
"epoch": 7.306936965473551,
|
1105 |
+
"grad_norm": 0.2615118622779846,
|
1106 |
+
"learning_rate": 6.346531517263224e-05,
|
1107 |
+
"loss": 0.0413,
|
1108 |
+
"step": 92272
|
1109 |
+
},
|
1110 |
+
{
|
1111 |
+
"epoch": 7.356984478935699,
|
1112 |
+
"grad_norm": 0.06805741786956787,
|
1113 |
+
"learning_rate": 6.321507760532151e-05,
|
1114 |
+
"loss": 0.0436,
|
1115 |
+
"step": 92904
|
1116 |
+
},
|
1117 |
+
{
|
1118 |
+
"epoch": 7.407031992397846,
|
1119 |
+
"grad_norm": 1.3070762157440186,
|
1120 |
+
"learning_rate": 6.296484003801078e-05,
|
1121 |
+
"loss": 0.0464,
|
1122 |
+
"step": 93536
|
1123 |
+
},
|
1124 |
+
{
|
1125 |
+
"epoch": 7.457079505859994,
|
1126 |
+
"grad_norm": 0.11481507122516632,
|
1127 |
+
"learning_rate": 6.271460247070003e-05,
|
1128 |
+
"loss": 0.0442,
|
1129 |
+
"step": 94168
|
1130 |
+
},
|
1131 |
+
{
|
1132 |
+
"epoch": 7.507127019322141,
|
1133 |
+
"grad_norm": 0.9575181603431702,
|
1134 |
+
"learning_rate": 6.246436490338929e-05,
|
1135 |
+
"loss": 0.043,
|
1136 |
+
"step": 94800
|
1137 |
+
},
|
1138 |
+
{
|
1139 |
+
"epoch": 7.5571745327842885,
|
1140 |
+
"grad_norm": 0.2559140920639038,
|
1141 |
+
"learning_rate": 6.221412733607856e-05,
|
1142 |
+
"loss": 0.0375,
|
1143 |
+
"step": 95432
|
1144 |
+
},
|
1145 |
+
{
|
1146 |
+
"epoch": 7.607222046246436,
|
1147 |
+
"grad_norm": 0.043967317789793015,
|
1148 |
+
"learning_rate": 6.196388976876782e-05,
|
1149 |
+
"loss": 0.0412,
|
1150 |
+
"step": 96064
|
1151 |
+
},
|
1152 |
+
{
|
1153 |
+
"epoch": 7.657269559708584,
|
1154 |
+
"grad_norm": 0.5306654572486877,
|
1155 |
+
"learning_rate": 6.171365220145709e-05,
|
1156 |
+
"loss": 0.0449,
|
1157 |
+
"step": 96696
|
1158 |
+
},
|
1159 |
+
{
|
1160 |
+
"epoch": 7.7073170731707314,
|
1161 |
+
"grad_norm": 0.03713352233171463,
|
1162 |
+
"learning_rate": 6.146341463414634e-05,
|
1163 |
+
"loss": 0.0443,
|
1164 |
+
"step": 97328
|
1165 |
+
},
|
1166 |
+
{
|
1167 |
+
"epoch": 7.757364586632879,
|
1168 |
+
"grad_norm": 0.6375567317008972,
|
1169 |
+
"learning_rate": 6.12131770668356e-05,
|
1170 |
+
"loss": 0.0441,
|
1171 |
+
"step": 97960
|
1172 |
+
},
|
1173 |
+
{
|
1174 |
+
"epoch": 7.807412100095027,
|
1175 |
+
"grad_norm": 0.9398515224456787,
|
1176 |
+
"learning_rate": 6.096293949952487e-05,
|
1177 |
+
"loss": 0.0421,
|
1178 |
+
"step": 98592
|
1179 |
+
},
|
1180 |
+
{
|
1181 |
+
"epoch": 7.857459613557174,
|
1182 |
+
"grad_norm": 0.6615290641784668,
|
1183 |
+
"learning_rate": 6.071270193221413e-05,
|
1184 |
+
"loss": 0.0468,
|
1185 |
+
"step": 99224
|
1186 |
+
},
|
1187 |
+
{
|
1188 |
+
"epoch": 7.907507127019322,
|
1189 |
+
"grad_norm": 0.48468518257141113,
|
1190 |
+
"learning_rate": 6.0462464364903394e-05,
|
1191 |
+
"loss": 0.0427,
|
1192 |
+
"step": 99856
|
1193 |
+
},
|
1194 |
+
{
|
1195 |
+
"epoch": 7.95755464048147,
|
1196 |
+
"grad_norm": 0.967910647392273,
|
1197 |
+
"learning_rate": 6.0212226797592654e-05,
|
1198 |
+
"loss": 0.0416,
|
1199 |
+
"step": 100488
|
1200 |
+
},
|
1201 |
+
{
|
1202 |
+
"epoch": 8.0,
|
1203 |
+
"eval_gen_len": 12.34714967,
|
1204 |
+
"eval_loss": 0.0372321754693985,
|
1205 |
+
"eval_runtime": 517.1292,
|
1206 |
+
"eval_samples_per_second": 102.918,
|
1207 |
+
"eval_steps_per_second": 1.609,
|
1208 |
+
"eval_wordacc": 0.93393709,
|
1209 |
+
"eval_wordacc_oov": 0.88868727,
|
1210 |
+
"step": 101024
|
1211 |
+
},
|
1212 |
+
{
|
1213 |
+
"epoch": 8.007602153943617,
|
1214 |
+
"grad_norm": 0.5545419454574585,
|
1215 |
+
"learning_rate": 5.9961989230281915e-05,
|
1216 |
+
"loss": 0.0447,
|
1217 |
+
"step": 101120
|
1218 |
+
},
|
1219 |
+
{
|
1220 |
+
"epoch": 8.057649667405766,
|
1221 |
+
"grad_norm": 1.7096141576766968,
|
1222 |
+
"learning_rate": 5.971175166297118e-05,
|
1223 |
+
"loss": 0.038,
|
1224 |
+
"step": 101752
|
1225 |
+
},
|
1226 |
+
{
|
1227 |
+
"epoch": 8.107697180867913,
|
1228 |
+
"grad_norm": 0.06642602384090424,
|
1229 |
+
"learning_rate": 5.946151409566044e-05,
|
1230 |
+
"loss": 0.0395,
|
1231 |
+
"step": 102384
|
1232 |
+
},
|
1233 |
+
{
|
1234 |
+
"epoch": 8.15774469433006,
|
1235 |
+
"grad_norm": 0.15396763384342194,
|
1236 |
+
"learning_rate": 5.9211276528349704e-05,
|
1237 |
+
"loss": 0.0411,
|
1238 |
+
"step": 103016
|
1239 |
+
},
|
1240 |
+
{
|
1241 |
+
"epoch": 8.207792207792208,
|
1242 |
+
"grad_norm": 1.0655204057693481,
|
1243 |
+
"learning_rate": 5.8961038961038965e-05,
|
1244 |
+
"loss": 0.0416,
|
1245 |
+
"step": 103648
|
1246 |
+
},
|
1247 |
+
{
|
1248 |
+
"epoch": 8.257839721254355,
|
1249 |
+
"grad_norm": 0.42243492603302,
|
1250 |
+
"learning_rate": 5.871080139372822e-05,
|
1251 |
+
"loss": 0.0445,
|
1252 |
+
"step": 104280
|
1253 |
+
},
|
1254 |
+
{
|
1255 |
+
"epoch": 8.307887234716503,
|
1256 |
+
"grad_norm": 0.05212310701608658,
|
1257 |
+
"learning_rate": 5.846056382641749e-05,
|
1258 |
+
"loss": 0.0392,
|
1259 |
+
"step": 104912
|
1260 |
+
},
|
1261 |
+
{
|
1262 |
+
"epoch": 8.35793474817865,
|
1263 |
+
"grad_norm": 0.6314841508865356,
|
1264 |
+
"learning_rate": 5.8210326259106754e-05,
|
1265 |
+
"loss": 0.0392,
|
1266 |
+
"step": 105544
|
1267 |
+
},
|
1268 |
+
{
|
1269 |
+
"epoch": 8.407982261640798,
|
1270 |
+
"grad_norm": 0.4014628827571869,
|
1271 |
+
"learning_rate": 5.796008869179601e-05,
|
1272 |
+
"loss": 0.0387,
|
1273 |
+
"step": 106176
|
1274 |
+
},
|
1275 |
+
{
|
1276 |
+
"epoch": 8.458029775102945,
|
1277 |
+
"grad_norm": 0.13107645511627197,
|
1278 |
+
"learning_rate": 5.770985112448527e-05,
|
1279 |
+
"loss": 0.0425,
|
1280 |
+
"step": 106808
|
1281 |
+
},
|
1282 |
+
{
|
1283 |
+
"epoch": 8.508077288565094,
|
1284 |
+
"grad_norm": 0.1390117108821869,
|
1285 |
+
"learning_rate": 5.745961355717454e-05,
|
1286 |
+
"loss": 0.038,
|
1287 |
+
"step": 107440
|
1288 |
+
},
|
1289 |
+
{
|
1290 |
+
"epoch": 8.55812480202724,
|
1291 |
+
"grad_norm": 0.1780831664800644,
|
1292 |
+
"learning_rate": 5.72093759898638e-05,
|
1293 |
+
"loss": 0.0397,
|
1294 |
+
"step": 108072
|
1295 |
+
},
|
1296 |
+
{
|
1297 |
+
"epoch": 8.608172315489389,
|
1298 |
+
"grad_norm": 0.8216772675514221,
|
1299 |
+
"learning_rate": 5.695913842255306e-05,
|
1300 |
+
"loss": 0.0377,
|
1301 |
+
"step": 108704
|
1302 |
+
},
|
1303 |
+
{
|
1304 |
+
"epoch": 8.658219828951536,
|
1305 |
+
"grad_norm": 0.040023334324359894,
|
1306 |
+
"learning_rate": 5.670890085524232e-05,
|
1307 |
+
"loss": 0.0395,
|
1308 |
+
"step": 109336
|
1309 |
+
},
|
1310 |
+
{
|
1311 |
+
"epoch": 8.708267342413684,
|
1312 |
+
"grad_norm": 0.7334257364273071,
|
1313 |
+
"learning_rate": 5.645866328793158e-05,
|
1314 |
+
"loss": 0.0401,
|
1315 |
+
"step": 109968
|
1316 |
+
},
|
1317 |
+
{
|
1318 |
+
"epoch": 8.758314855875831,
|
1319 |
+
"grad_norm": 0.09213205426931381,
|
1320 |
+
"learning_rate": 5.620842572062085e-05,
|
1321 |
+
"loss": 0.038,
|
1322 |
+
"step": 110600
|
1323 |
+
},
|
1324 |
+
{
|
1325 |
+
"epoch": 8.80836236933798,
|
1326 |
+
"grad_norm": 0.8264344930648804,
|
1327 |
+
"learning_rate": 5.595818815331011e-05,
|
1328 |
+
"loss": 0.0404,
|
1329 |
+
"step": 111232
|
1330 |
+
},
|
1331 |
+
{
|
1332 |
+
"epoch": 8.858409882800126,
|
1333 |
+
"grad_norm": 0.4497428238391876,
|
1334 |
+
"learning_rate": 5.570795058599937e-05,
|
1335 |
+
"loss": 0.0417,
|
1336 |
+
"step": 111864
|
1337 |
+
},
|
1338 |
+
{
|
1339 |
+
"epoch": 8.908457396262275,
|
1340 |
+
"grad_norm": 0.02390374056994915,
|
1341 |
+
"learning_rate": 5.545771301868863e-05,
|
1342 |
+
"loss": 0.0351,
|
1343 |
+
"step": 112496
|
1344 |
+
},
|
1345 |
+
{
|
1346 |
+
"epoch": 8.958504909724422,
|
1347 |
+
"grad_norm": 0.1745648831129074,
|
1348 |
+
"learning_rate": 5.520747545137789e-05,
|
1349 |
+
"loss": 0.042,
|
1350 |
+
"step": 113128
|
1351 |
+
},
|
1352 |
+
{
|
1353 |
+
"epoch": 9.0,
|
1354 |
+
"eval_gen_len": 12.3485025,
|
1355 |
+
"eval_loss": 0.03648155927658081,
|
1356 |
+
"eval_runtime": 514.9505,
|
1357 |
+
"eval_samples_per_second": 103.354,
|
1358 |
+
"eval_steps_per_second": 1.616,
|
1359 |
+
"eval_wordacc": 0.93964902,
|
1360 |
+
"eval_wordacc_oov": 0.89444214,
|
1361 |
+
"step": 113652
|
1362 |
+
},
|
1363 |
+
{
|
1364 |
+
"epoch": 9.00855242318657,
|
1365 |
+
"grad_norm": 0.7463224530220032,
|
1366 |
+
"learning_rate": 5.4957237884067156e-05,
|
1367 |
+
"loss": 0.0393,
|
1368 |
+
"step": 113760
|
1369 |
+
},
|
1370 |
+
{
|
1371 |
+
"epoch": 9.058599936648717,
|
1372 |
+
"grad_norm": 0.2374447137117386,
|
1373 |
+
"learning_rate": 5.470700031675642e-05,
|
1374 |
+
"loss": 0.0349,
|
1375 |
+
"step": 114392
|
1376 |
+
},
|
1377 |
+
{
|
1378 |
+
"epoch": 9.108647450110865,
|
1379 |
+
"grad_norm": 1.1859426498413086,
|
1380 |
+
"learning_rate": 5.445676274944568e-05,
|
1381 |
+
"loss": 0.039,
|
1382 |
+
"step": 115024
|
1383 |
+
},
|
1384 |
+
{
|
1385 |
+
"epoch": 9.158694963573012,
|
1386 |
+
"grad_norm": 0.03181586042046547,
|
1387 |
+
"learning_rate": 5.420652518213494e-05,
|
1388 |
+
"loss": 0.0354,
|
1389 |
+
"step": 115656
|
1390 |
+
},
|
1391 |
+
{
|
1392 |
+
"epoch": 9.20874247703516,
|
1393 |
+
"grad_norm": 0.0614316463470459,
|
1394 |
+
"learning_rate": 5.3956287614824206e-05,
|
1395 |
+
"loss": 0.0337,
|
1396 |
+
"step": 116288
|
1397 |
+
},
|
1398 |
+
{
|
1399 |
+
"epoch": 9.258789990497307,
|
1400 |
+
"grad_norm": 0.031755685806274414,
|
1401 |
+
"learning_rate": 5.370605004751347e-05,
|
1402 |
+
"loss": 0.0346,
|
1403 |
+
"step": 116920
|
1404 |
+
},
|
1405 |
+
{
|
1406 |
+
"epoch": 9.308837503959456,
|
1407 |
+
"grad_norm": 0.4263891875743866,
|
1408 |
+
"learning_rate": 5.345581248020273e-05,
|
1409 |
+
"loss": 0.0365,
|
1410 |
+
"step": 117552
|
1411 |
+
},
|
1412 |
+
{
|
1413 |
+
"epoch": 9.358885017421603,
|
1414 |
+
"grad_norm": 0.7516904473304749,
|
1415 |
+
"learning_rate": 5.320557491289199e-05,
|
1416 |
+
"loss": 0.0394,
|
1417 |
+
"step": 118184
|
1418 |
+
},
|
1419 |
+
{
|
1420 |
+
"epoch": 9.408932530883751,
|
1421 |
+
"grad_norm": 0.547282338142395,
|
1422 |
+
"learning_rate": 5.295533734558125e-05,
|
1423 |
+
"loss": 0.037,
|
1424 |
+
"step": 118816
|
1425 |
+
},
|
1426 |
+
{
|
1427 |
+
"epoch": 9.458980044345898,
|
1428 |
+
"grad_norm": 0.048598043620586395,
|
1429 |
+
"learning_rate": 5.2705099778270516e-05,
|
1430 |
+
"loss": 0.0396,
|
1431 |
+
"step": 119448
|
1432 |
+
},
|
1433 |
+
{
|
1434 |
+
"epoch": 9.509027557808047,
|
1435 |
+
"grad_norm": 0.019015343859791756,
|
1436 |
+
"learning_rate": 5.245486221095978e-05,
|
1437 |
+
"loss": 0.0377,
|
1438 |
+
"step": 120080
|
1439 |
+
},
|
1440 |
+
{
|
1441 |
+
"epoch": 9.559075071270193,
|
1442 |
+
"grad_norm": 0.42539820075035095,
|
1443 |
+
"learning_rate": 5.220462464364904e-05,
|
1444 |
+
"loss": 0.0361,
|
1445 |
+
"step": 120712
|
1446 |
+
},
|
1447 |
+
{
|
1448 |
+
"epoch": 9.60912258473234,
|
1449 |
+
"grad_norm": 0.5751402974128723,
|
1450 |
+
"learning_rate": 5.19543870763383e-05,
|
1451 |
+
"loss": 0.0407,
|
1452 |
+
"step": 121344
|
1453 |
+
},
|
1454 |
+
{
|
1455 |
+
"epoch": 9.659170098194489,
|
1456 |
+
"grad_norm": 1.380823016166687,
|
1457 |
+
"learning_rate": 5.170414950902755e-05,
|
1458 |
+
"loss": 0.0388,
|
1459 |
+
"step": 121976
|
1460 |
+
},
|
1461 |
+
{
|
1462 |
+
"epoch": 9.709217611656635,
|
1463 |
+
"grad_norm": 0.6849233508110046,
|
1464 |
+
"learning_rate": 5.145391194171683e-05,
|
1465 |
+
"loss": 0.0387,
|
1466 |
+
"step": 122608
|
1467 |
+
},
|
1468 |
+
{
|
1469 |
+
"epoch": 9.759265125118784,
|
1470 |
+
"grad_norm": 1.4156357049942017,
|
1471 |
+
"learning_rate": 5.120367437440609e-05,
|
1472 |
+
"loss": 0.0373,
|
1473 |
+
"step": 123240
|
1474 |
+
},
|
1475 |
+
{
|
1476 |
+
"epoch": 9.80931263858093,
|
1477 |
+
"grad_norm": 0.18062171339988708,
|
1478 |
+
"learning_rate": 5.095343680709535e-05,
|
1479 |
+
"loss": 0.0333,
|
1480 |
+
"step": 123872
|
1481 |
+
},
|
1482 |
+
{
|
1483 |
+
"epoch": 9.85936015204308,
|
1484 |
+
"grad_norm": 0.6566870212554932,
|
1485 |
+
"learning_rate": 5.07031992397846e-05,
|
1486 |
+
"loss": 0.0367,
|
1487 |
+
"step": 124504
|
1488 |
+
},
|
1489 |
+
{
|
1490 |
+
"epoch": 9.909407665505226,
|
1491 |
+
"grad_norm": 0.2741030156612396,
|
1492 |
+
"learning_rate": 5.0452961672473876e-05,
|
1493 |
+
"loss": 0.0426,
|
1494 |
+
"step": 125136
|
1495 |
+
},
|
1496 |
+
{
|
1497 |
+
"epoch": 9.959455178967374,
|
1498 |
+
"grad_norm": 0.7864658236503601,
|
1499 |
+
"learning_rate": 5.020272410516314e-05,
|
1500 |
+
"loss": 0.0376,
|
1501 |
+
"step": 125768
|
1502 |
+
},
|
1503 |
+
{
|
1504 |
+
"epoch": 10.0,
|
1505 |
+
"eval_gen_len": 12.34852129,
|
1506 |
+
"eval_loss": 0.03532838076353073,
|
1507 |
+
"eval_runtime": 515.5211,
|
1508 |
+
"eval_samples_per_second": 103.239,
|
1509 |
+
"eval_steps_per_second": 1.614,
|
1510 |
+
"eval_wordacc": 0.94117094,
|
1511 |
+
"eval_wordacc_oov": 0.89620489,
|
1512 |
+
"step": 126280
|
1513 |
+
},
|
1514 |
+
{
|
1515 |
+
"epoch": 10.009502692429521,
|
1516 |
+
"grad_norm": 0.08346331119537354,
|
1517 |
+
"learning_rate": 4.995248653785239e-05,
|
1518 |
+
"loss": 0.0358,
|
1519 |
+
"step": 126400
|
1520 |
+
},
|
1521 |
+
{
|
1522 |
+
"epoch": 10.05955020589167,
|
1523 |
+
"grad_norm": 0.1552925556898117,
|
1524 |
+
"learning_rate": 4.970224897054165e-05,
|
1525 |
+
"loss": 0.0286,
|
1526 |
+
"step": 127032
|
1527 |
+
},
|
1528 |
+
{
|
1529 |
+
"epoch": 10.109597719353816,
|
1530 |
+
"grad_norm": 0.017832357436418533,
|
1531 |
+
"learning_rate": 4.945201140323092e-05,
|
1532 |
+
"loss": 0.0342,
|
1533 |
+
"step": 127664
|
1534 |
+
},
|
1535 |
+
{
|
1536 |
+
"epoch": 10.159645232815965,
|
1537 |
+
"grad_norm": 0.7822960019111633,
|
1538 |
+
"learning_rate": 4.920177383592018e-05,
|
1539 |
+
"loss": 0.0342,
|
1540 |
+
"step": 128296
|
1541 |
+
},
|
1542 |
+
{
|
1543 |
+
"epoch": 10.209692746278112,
|
1544 |
+
"grad_norm": 0.8483818173408508,
|
1545 |
+
"learning_rate": 4.895153626860944e-05,
|
1546 |
+
"loss": 0.036,
|
1547 |
+
"step": 128928
|
1548 |
+
},
|
1549 |
+
{
|
1550 |
+
"epoch": 10.25974025974026,
|
1551 |
+
"grad_norm": 0.12484422326087952,
|
1552 |
+
"learning_rate": 4.87012987012987e-05,
|
1553 |
+
"loss": 0.0353,
|
1554 |
+
"step": 129560
|
1555 |
+
},
|
1556 |
+
{
|
1557 |
+
"epoch": 10.309787773202407,
|
1558 |
+
"grad_norm": 1.0866436958312988,
|
1559 |
+
"learning_rate": 4.845106113398797e-05,
|
1560 |
+
"loss": 0.0336,
|
1561 |
+
"step": 130192
|
1562 |
+
},
|
1563 |
+
{
|
1564 |
+
"epoch": 10.359835286664556,
|
1565 |
+
"grad_norm": 2.065387487411499,
|
1566 |
+
"learning_rate": 4.820082356667723e-05,
|
1567 |
+
"loss": 0.036,
|
1568 |
+
"step": 130824
|
1569 |
+
},
|
1570 |
+
{
|
1571 |
+
"epoch": 10.409882800126702,
|
1572 |
+
"grad_norm": 0.13544411957263947,
|
1573 |
+
"learning_rate": 4.795058599936649e-05,
|
1574 |
+
"loss": 0.0353,
|
1575 |
+
"step": 131456
|
1576 |
+
},
|
1577 |
+
{
|
1578 |
+
"epoch": 10.45993031358885,
|
1579 |
+
"grad_norm": 0.8644410967826843,
|
1580 |
+
"learning_rate": 4.770034843205575e-05,
|
1581 |
+
"loss": 0.0366,
|
1582 |
+
"step": 132088
|
1583 |
+
},
|
1584 |
+
{
|
1585 |
+
"epoch": 10.509977827050998,
|
1586 |
+
"grad_norm": 0.044758204370737076,
|
1587 |
+
"learning_rate": 4.745011086474501e-05,
|
1588 |
+
"loss": 0.0337,
|
1589 |
+
"step": 132720
|
1590 |
+
},
|
1591 |
+
{
|
1592 |
+
"epoch": 10.560025340513146,
|
1593 |
+
"grad_norm": 0.07954395562410355,
|
1594 |
+
"learning_rate": 4.719987329743428e-05,
|
1595 |
+
"loss": 0.038,
|
1596 |
+
"step": 133352
|
1597 |
+
},
|
1598 |
+
{
|
1599 |
+
"epoch": 10.610072853975293,
|
1600 |
+
"grad_norm": 0.15263523161411285,
|
1601 |
+
"learning_rate": 4.694963573012354e-05,
|
1602 |
+
"loss": 0.037,
|
1603 |
+
"step": 133984
|
1604 |
+
},
|
1605 |
+
{
|
1606 |
+
"epoch": 10.660120367437441,
|
1607 |
+
"grad_norm": 0.4090266227722168,
|
1608 |
+
"learning_rate": 4.66993981628128e-05,
|
1609 |
+
"loss": 0.0341,
|
1610 |
+
"step": 134616
|
1611 |
+
},
|
1612 |
+
{
|
1613 |
+
"epoch": 10.710167880899588,
|
1614 |
+
"grad_norm": 0.3766542077064514,
|
1615 |
+
"learning_rate": 4.644916059550206e-05,
|
1616 |
+
"loss": 0.0395,
|
1617 |
+
"step": 135248
|
1618 |
+
},
|
1619 |
+
{
|
1620 |
+
"epoch": 10.760215394361737,
|
1621 |
+
"grad_norm": 0.044228482991456985,
|
1622 |
+
"learning_rate": 4.619892302819132e-05,
|
1623 |
+
"loss": 0.0354,
|
1624 |
+
"step": 135880
|
1625 |
+
},
|
1626 |
+
{
|
1627 |
+
"epoch": 10.810262907823883,
|
1628 |
+
"grad_norm": 3.4171650409698486,
|
1629 |
+
"learning_rate": 4.594868546088059e-05,
|
1630 |
+
"loss": 0.0344,
|
1631 |
+
"step": 136512
|
1632 |
+
},
|
1633 |
+
{
|
1634 |
+
"epoch": 10.86031042128603,
|
1635 |
+
"grad_norm": 0.1112111434340477,
|
1636 |
+
"learning_rate": 4.569844789356984e-05,
|
1637 |
+
"loss": 0.0361,
|
1638 |
+
"step": 137144
|
1639 |
+
},
|
1640 |
+
{
|
1641 |
+
"epoch": 10.910357934748179,
|
1642 |
+
"grad_norm": 0.09063247591257095,
|
1643 |
+
"learning_rate": 4.544821032625911e-05,
|
1644 |
+
"loss": 0.0325,
|
1645 |
+
"step": 137776
|
1646 |
+
},
|
1647 |
+
{
|
1648 |
+
"epoch": 10.960405448210325,
|
1649 |
+
"grad_norm": 0.2144654095172882,
|
1650 |
+
"learning_rate": 4.519797275894837e-05,
|
1651 |
+
"loss": 0.031,
|
1652 |
+
"step": 138408
|
1653 |
+
},
|
1654 |
+
{
|
1655 |
+
"epoch": 11.0,
|
1656 |
+
"eval_gen_len": 12.35186577,
|
1657 |
+
"eval_loss": 0.03388630226254463,
|
1658 |
+
"eval_runtime": 519.1861,
|
1659 |
+
"eval_samples_per_second": 102.51,
|
1660 |
+
"eval_steps_per_second": 1.603,
|
1661 |
+
"eval_wordacc": 0.94387659,
|
1662 |
+
"eval_wordacc_oov": 0.90081916,
|
1663 |
+
"step": 138908
|
1664 |
+
},
|
1665 |
+
{
|
1666 |
+
"epoch": 11.010452961672474,
|
1667 |
+
"grad_norm": 0.9230628609657288,
|
1668 |
+
"learning_rate": 4.494773519163763e-05,
|
1669 |
+
"loss": 0.0336,
|
1670 |
+
"step": 139040
|
1671 |
+
},
|
1672 |
+
{
|
1673 |
+
"epoch": 11.06050047513462,
|
1674 |
+
"grad_norm": 0.04931863397359848,
|
1675 |
+
"learning_rate": 4.469749762432689e-05,
|
1676 |
+
"loss": 0.0332,
|
1677 |
+
"step": 139672
|
1678 |
+
},
|
1679 |
+
{
|
1680 |
+
"epoch": 11.11054798859677,
|
1681 |
+
"grad_norm": 0.43372172117233276,
|
1682 |
+
"learning_rate": 4.4447260057016154e-05,
|
1683 |
+
"loss": 0.0303,
|
1684 |
+
"step": 140304
|
1685 |
+
},
|
1686 |
+
{
|
1687 |
+
"epoch": 11.160595502058916,
|
1688 |
+
"grad_norm": 1.8216339349746704,
|
1689 |
+
"learning_rate": 4.419702248970542e-05,
|
1690 |
+
"loss": 0.0327,
|
1691 |
+
"step": 140936
|
1692 |
+
},
|
1693 |
+
{
|
1694 |
+
"epoch": 11.210643015521065,
|
1695 |
+
"grad_norm": 0.4168206453323364,
|
1696 |
+
"learning_rate": 4.394678492239468e-05,
|
1697 |
+
"loss": 0.0299,
|
1698 |
+
"step": 141568
|
1699 |
+
},
|
1700 |
+
{
|
1701 |
+
"epoch": 11.260690528983211,
|
1702 |
+
"grad_norm": 0.2748374938964844,
|
1703 |
+
"learning_rate": 4.369654735508394e-05,
|
1704 |
+
"loss": 0.031,
|
1705 |
+
"step": 142200
|
1706 |
+
},
|
1707 |
+
{
|
1708 |
+
"epoch": 11.31073804244536,
|
1709 |
+
"grad_norm": 1.0721484422683716,
|
1710 |
+
"learning_rate": 4.34463097877732e-05,
|
1711 |
+
"loss": 0.0338,
|
1712 |
+
"step": 142832
|
1713 |
+
},
|
1714 |
+
{
|
1715 |
+
"epoch": 11.360785555907507,
|
1716 |
+
"grad_norm": 0.6880238056182861,
|
1717 |
+
"learning_rate": 4.319607222046247e-05,
|
1718 |
+
"loss": 0.0306,
|
1719 |
+
"step": 143464
|
1720 |
+
},
|
1721 |
+
{
|
1722 |
+
"epoch": 11.410833069369655,
|
1723 |
+
"grad_norm": 0.6363319158554077,
|
1724 |
+
"learning_rate": 4.294583465315173e-05,
|
1725 |
+
"loss": 0.0324,
|
1726 |
+
"step": 144096
|
1727 |
+
},
|
1728 |
+
{
|
1729 |
+
"epoch": 11.460880582831802,
|
1730 |
+
"grad_norm": 0.9793805480003357,
|
1731 |
+
"learning_rate": 4.2695597085840985e-05,
|
1732 |
+
"loss": 0.0352,
|
1733 |
+
"step": 144728
|
1734 |
+
},
|
1735 |
+
{
|
1736 |
+
"epoch": 11.51092809629395,
|
1737 |
+
"grad_norm": 0.05823446065187454,
|
1738 |
+
"learning_rate": 4.244535951853025e-05,
|
1739 |
+
"loss": 0.0302,
|
1740 |
+
"step": 145360
|
1741 |
+
},
|
1742 |
+
{
|
1743 |
+
"epoch": 11.560975609756097,
|
1744 |
+
"grad_norm": 0.44829338788986206,
|
1745 |
+
"learning_rate": 4.2195121951219514e-05,
|
1746 |
+
"loss": 0.0317,
|
1747 |
+
"step": 145992
|
1748 |
+
},
|
1749 |
+
{
|
1750 |
+
"epoch": 11.611023123218246,
|
1751 |
+
"grad_norm": 0.21787405014038086,
|
1752 |
+
"learning_rate": 4.194488438390878e-05,
|
1753 |
+
"loss": 0.0327,
|
1754 |
+
"step": 146624
|
1755 |
+
},
|
1756 |
+
{
|
1757 |
+
"epoch": 11.661070636680392,
|
1758 |
+
"grad_norm": 0.09327682852745056,
|
1759 |
+
"learning_rate": 4.1694646816598035e-05,
|
1760 |
+
"loss": 0.0343,
|
1761 |
+
"step": 147256
|
1762 |
+
},
|
1763 |
+
{
|
1764 |
+
"epoch": 11.711118150142541,
|
1765 |
+
"grad_norm": 0.4916613698005676,
|
1766 |
+
"learning_rate": 4.14444092492873e-05,
|
1767 |
+
"loss": 0.0358,
|
1768 |
+
"step": 147888
|
1769 |
+
},
|
1770 |
+
{
|
1771 |
+
"epoch": 11.761165663604688,
|
1772 |
+
"grad_norm": 0.04805804416537285,
|
1773 |
+
"learning_rate": 4.119417168197656e-05,
|
1774 |
+
"loss": 0.0343,
|
1775 |
+
"step": 148520
|
1776 |
+
},
|
1777 |
+
{
|
1778 |
+
"epoch": 11.811213177066836,
|
1779 |
+
"grad_norm": 0.019468722864985466,
|
1780 |
+
"learning_rate": 4.0943934114665824e-05,
|
1781 |
+
"loss": 0.0298,
|
1782 |
+
"step": 149152
|
1783 |
+
},
|
1784 |
+
{
|
1785 |
+
"epoch": 11.861260690528983,
|
1786 |
+
"grad_norm": 0.09931553900241852,
|
1787 |
+
"learning_rate": 4.0693696547355085e-05,
|
1788 |
+
"loss": 0.0304,
|
1789 |
+
"step": 149784
|
1790 |
+
},
|
1791 |
+
{
|
1792 |
+
"epoch": 11.911308203991132,
|
1793 |
+
"grad_norm": 0.03299326449632645,
|
1794 |
+
"learning_rate": 4.0443458980044345e-05,
|
1795 |
+
"loss": 0.0355,
|
1796 |
+
"step": 150416
|
1797 |
+
},
|
1798 |
+
{
|
1799 |
+
"epoch": 11.961355717453278,
|
1800 |
+
"grad_norm": 0.11701735109090805,
|
1801 |
+
"learning_rate": 4.019322141273361e-05,
|
1802 |
+
"loss": 0.0298,
|
1803 |
+
"step": 151048
|
1804 |
+
},
|
1805 |
+
{
|
1806 |
+
"epoch": 12.0,
|
1807 |
+
"eval_gen_len": 12.34792003,
|
1808 |
+
"eval_loss": 0.03365420550107956,
|
1809 |
+
"eval_runtime": 512.0738,
|
1810 |
+
"eval_samples_per_second": 103.934,
|
1811 |
+
"eval_steps_per_second": 1.625,
|
1812 |
+
"eval_wordacc": 0.94539852,
|
1813 |
+
"eval_wordacc_oov": 0.90128577,
|
1814 |
+
"step": 151536
|
1815 |
+
},
|
1816 |
+
{
|
1817 |
+
"epoch": 12.011403230915427,
|
1818 |
+
"grad_norm": 0.14153118431568146,
|
1819 |
+
"learning_rate": 3.9942983845422874e-05,
|
1820 |
+
"loss": 0.0294,
|
1821 |
+
"step": 151680
|
1822 |
+
},
|
1823 |
+
{
|
1824 |
+
"epoch": 12.061450744377574,
|
1825 |
+
"grad_norm": 0.8496013283729553,
|
1826 |
+
"learning_rate": 3.9692746278112134e-05,
|
1827 |
+
"loss": 0.0302,
|
1828 |
+
"step": 152312
|
1829 |
+
},
|
1830 |
+
{
|
1831 |
+
"epoch": 12.111498257839722,
|
1832 |
+
"grad_norm": 0.3620108664035797,
|
1833 |
+
"learning_rate": 3.9442508710801395e-05,
|
1834 |
+
"loss": 0.0293,
|
1835 |
+
"step": 152944
|
1836 |
+
},
|
1837 |
+
{
|
1838 |
+
"epoch": 12.161545771301869,
|
1839 |
+
"grad_norm": 0.8358076810836792,
|
1840 |
+
"learning_rate": 3.9192271143490656e-05,
|
1841 |
+
"loss": 0.0279,
|
1842 |
+
"step": 153576
|
1843 |
+
},
|
1844 |
+
{
|
1845 |
+
"epoch": 12.211593284764016,
|
1846 |
+
"grad_norm": 2.183274745941162,
|
1847 |
+
"learning_rate": 3.894203357617992e-05,
|
1848 |
+
"loss": 0.0298,
|
1849 |
+
"step": 154208
|
1850 |
+
},
|
1851 |
+
{
|
1852 |
+
"epoch": 12.261640798226164,
|
1853 |
+
"grad_norm": 0.11177387088537216,
|
1854 |
+
"learning_rate": 3.869179600886918e-05,
|
1855 |
+
"loss": 0.0279,
|
1856 |
+
"step": 154840
|
1857 |
+
},
|
1858 |
+
{
|
1859 |
+
"epoch": 12.311688311688311,
|
1860 |
+
"grad_norm": 0.16532179713249207,
|
1861 |
+
"learning_rate": 3.8441558441558445e-05,
|
1862 |
+
"loss": 0.0283,
|
1863 |
+
"step": 155472
|
1864 |
+
},
|
1865 |
+
{
|
1866 |
+
"epoch": 12.36173582515046,
|
1867 |
+
"grad_norm": 0.2667466104030609,
|
1868 |
+
"learning_rate": 3.8191320874247705e-05,
|
1869 |
+
"loss": 0.0324,
|
1870 |
+
"step": 156104
|
1871 |
+
},
|
1872 |
+
{
|
1873 |
+
"epoch": 12.411783338612606,
|
1874 |
+
"grad_norm": 1.4449559450149536,
|
1875 |
+
"learning_rate": 3.794108330693697e-05,
|
1876 |
+
"loss": 0.0327,
|
1877 |
+
"step": 156736
|
1878 |
+
},
|
1879 |
+
{
|
1880 |
+
"epoch": 12.461830852074755,
|
1881 |
+
"grad_norm": 1.9251255989074707,
|
1882 |
+
"learning_rate": 3.769084573962623e-05,
|
1883 |
+
"loss": 0.0293,
|
1884 |
+
"step": 157368
|
1885 |
+
},
|
1886 |
+
{
|
1887 |
+
"epoch": 12.511878365536901,
|
1888 |
+
"grad_norm": 1.981763243675232,
|
1889 |
+
"learning_rate": 3.744060817231549e-05,
|
1890 |
+
"loss": 0.0296,
|
1891 |
+
"step": 158000
|
1892 |
+
},
|
1893 |
+
{
|
1894 |
+
"epoch": 12.56192587899905,
|
1895 |
+
"grad_norm": 0.5154635906219482,
|
1896 |
+
"learning_rate": 3.7190370605004755e-05,
|
1897 |
+
"loss": 0.0326,
|
1898 |
+
"step": 158632
|
1899 |
+
},
|
1900 |
+
{
|
1901 |
+
"epoch": 12.611973392461197,
|
1902 |
+
"grad_norm": 0.7047850489616394,
|
1903 |
+
"learning_rate": 3.6940133037694016e-05,
|
1904 |
+
"loss": 0.0313,
|
1905 |
+
"step": 159264
|
1906 |
+
},
|
1907 |
+
{
|
1908 |
+
"epoch": 12.662020905923345,
|
1909 |
+
"grad_norm": 0.05169570446014404,
|
1910 |
+
"learning_rate": 3.6689895470383276e-05,
|
1911 |
+
"loss": 0.0311,
|
1912 |
+
"step": 159896
|
1913 |
+
},
|
1914 |
+
{
|
1915 |
+
"epoch": 12.712068419385492,
|
1916 |
+
"grad_norm": 0.3161393105983734,
|
1917 |
+
"learning_rate": 3.643965790307254e-05,
|
1918 |
+
"loss": 0.0295,
|
1919 |
+
"step": 160528
|
1920 |
+
},
|
1921 |
+
{
|
1922 |
+
"epoch": 12.76211593284764,
|
1923 |
+
"grad_norm": 0.4407559633255005,
|
1924 |
+
"learning_rate": 3.6189420335761805e-05,
|
1925 |
+
"loss": 0.0307,
|
1926 |
+
"step": 161160
|
1927 |
+
},
|
1928 |
+
{
|
1929 |
+
"epoch": 12.812163446309787,
|
1930 |
+
"grad_norm": 2.090928792953491,
|
1931 |
+
"learning_rate": 3.5939182768451065e-05,
|
1932 |
+
"loss": 0.0288,
|
1933 |
+
"step": 161792
|
1934 |
+
},
|
1935 |
+
{
|
1936 |
+
"epoch": 12.862210959771936,
|
1937 |
+
"grad_norm": 0.039744604378938675,
|
1938 |
+
"learning_rate": 3.568894520114032e-05,
|
1939 |
+
"loss": 0.0285,
|
1940 |
+
"step": 162424
|
1941 |
+
},
|
1942 |
+
{
|
1943 |
+
"epoch": 12.912258473234083,
|
1944 |
+
"grad_norm": 0.48494115471839905,
|
1945 |
+
"learning_rate": 3.543870763382959e-05,
|
1946 |
+
"loss": 0.0302,
|
1947 |
+
"step": 163056
|
1948 |
+
},
|
1949 |
+
{
|
1950 |
+
"epoch": 12.962305986696231,
|
1951 |
+
"grad_norm": 0.38117051124572754,
|
1952 |
+
"learning_rate": 3.518847006651885e-05,
|
1953 |
+
"loss": 0.0302,
|
1954 |
+
"step": 163688
|
1955 |
+
},
|
1956 |
+
{
|
1957 |
+
"epoch": 13.0,
|
1958 |
+
"eval_gen_len": 12.34829582,
|
1959 |
+
"eval_loss": 0.032208241522312164,
|
1960 |
+
"eval_runtime": 516.3051,
|
1961 |
+
"eval_samples_per_second": 103.082,
|
1962 |
+
"eval_steps_per_second": 1.611,
|
1963 |
+
"eval_wordacc": 0.94701439,
|
1964 |
+
"eval_wordacc_oov": 0.90429282,
|
1965 |
+
"step": 164164
|
1966 |
+
},
|
1967 |
+
{
|
1968 |
+
"epoch": 13.012353500158378,
|
1969 |
+
"grad_norm": 0.5338648557662964,
|
1970 |
+
"learning_rate": 3.4938232499208115e-05,
|
1971 |
+
"loss": 0.0273,
|
1972 |
+
"step": 164320
|
1973 |
+
},
|
1974 |
+
{
|
1975 |
+
"epoch": 13.062401013620526,
|
1976 |
+
"grad_norm": 0.5787109136581421,
|
1977 |
+
"learning_rate": 3.468799493189737e-05,
|
1978 |
+
"loss": 0.0278,
|
1979 |
+
"step": 164952
|
1980 |
+
},
|
1981 |
+
{
|
1982 |
+
"epoch": 13.112448527082673,
|
1983 |
+
"grad_norm": 3.402815818786621,
|
1984 |
+
"learning_rate": 3.4437757364586636e-05,
|
1985 |
+
"loss": 0.0309,
|
1986 |
+
"step": 165584
|
1987 |
+
},
|
1988 |
+
{
|
1989 |
+
"epoch": 13.162496040544822,
|
1990 |
+
"grad_norm": 0.3387065827846527,
|
1991 |
+
"learning_rate": 3.41875197972759e-05,
|
1992 |
+
"loss": 0.0266,
|
1993 |
+
"step": 166216
|
1994 |
+
},
|
1995 |
+
{
|
1996 |
+
"epoch": 13.212543554006968,
|
1997 |
+
"grad_norm": 0.6726759076118469,
|
1998 |
+
"learning_rate": 3.393728222996516e-05,
|
1999 |
+
"loss": 0.0257,
|
2000 |
+
"step": 166848
|
2001 |
+
},
|
2002 |
+
{
|
2003 |
+
"epoch": 13.262591067469117,
|
2004 |
+
"grad_norm": 0.039511628448963165,
|
2005 |
+
"learning_rate": 3.368704466265442e-05,
|
2006 |
+
"loss": 0.0285,
|
2007 |
+
"step": 167480
|
2008 |
+
},
|
2009 |
+
{
|
2010 |
+
"epoch": 13.312638580931264,
|
2011 |
+
"grad_norm": 0.16470862925052643,
|
2012 |
+
"learning_rate": 3.343680709534368e-05,
|
2013 |
+
"loss": 0.0265,
|
2014 |
+
"step": 168112
|
2015 |
+
},
|
2016 |
+
{
|
2017 |
+
"epoch": 13.362686094393412,
|
2018 |
+
"grad_norm": 1.1072640419006348,
|
2019 |
+
"learning_rate": 3.318656952803295e-05,
|
2020 |
+
"loss": 0.0289,
|
2021 |
+
"step": 168744
|
2022 |
+
},
|
2023 |
+
{
|
2024 |
+
"epoch": 13.412733607855559,
|
2025 |
+
"grad_norm": 0.4822203516960144,
|
2026 |
+
"learning_rate": 3.293633196072221e-05,
|
2027 |
+
"loss": 0.0282,
|
2028 |
+
"step": 169376
|
2029 |
+
},
|
2030 |
+
{
|
2031 |
+
"epoch": 13.462781121317708,
|
2032 |
+
"grad_norm": 0.29173898696899414,
|
2033 |
+
"learning_rate": 3.268609439341147e-05,
|
2034 |
+
"loss": 0.025,
|
2035 |
+
"step": 170008
|
2036 |
+
},
|
2037 |
+
{
|
2038 |
+
"epoch": 13.512828634779854,
|
2039 |
+
"grad_norm": 0.12304585427045822,
|
2040 |
+
"learning_rate": 3.243585682610073e-05,
|
2041 |
+
"loss": 0.0286,
|
2042 |
+
"step": 170640
|
2043 |
+
},
|
2044 |
+
{
|
2045 |
+
"epoch": 13.562876148242001,
|
2046 |
+
"grad_norm": 0.02250618301331997,
|
2047 |
+
"learning_rate": 3.218561925878999e-05,
|
2048 |
+
"loss": 0.0261,
|
2049 |
+
"step": 171272
|
2050 |
+
},
|
2051 |
+
{
|
2052 |
+
"epoch": 13.61292366170415,
|
2053 |
+
"grad_norm": 1.214078664779663,
|
2054 |
+
"learning_rate": 3.193538169147926e-05,
|
2055 |
+
"loss": 0.0265,
|
2056 |
+
"step": 171904
|
2057 |
+
},
|
2058 |
+
{
|
2059 |
+
"epoch": 13.662971175166296,
|
2060 |
+
"grad_norm": 1.080644965171814,
|
2061 |
+
"learning_rate": 3.168514412416851e-05,
|
2062 |
+
"loss": 0.0264,
|
2063 |
+
"step": 172536
|
2064 |
+
},
|
2065 |
+
{
|
2066 |
+
"epoch": 13.713018688628445,
|
2067 |
+
"grad_norm": 0.47105857729911804,
|
2068 |
+
"learning_rate": 3.143490655685778e-05,
|
2069 |
+
"loss": 0.03,
|
2070 |
+
"step": 173168
|
2071 |
+
},
|
2072 |
+
{
|
2073 |
+
"epoch": 13.763066202090592,
|
2074 |
+
"grad_norm": 0.8579492568969727,
|
2075 |
+
"learning_rate": 3.118466898954704e-05,
|
2076 |
+
"loss": 0.03,
|
2077 |
+
"step": 173800
|
2078 |
+
},
|
2079 |
+
{
|
2080 |
+
"epoch": 13.81311371555274,
|
2081 |
+
"grad_norm": 1.1287840604782104,
|
2082 |
+
"learning_rate": 3.0934431422236307e-05,
|
2083 |
+
"loss": 0.0274,
|
2084 |
+
"step": 174432
|
2085 |
+
},
|
2086 |
+
{
|
2087 |
+
"epoch": 13.863161229014887,
|
2088 |
+
"grad_norm": 1.2223260402679443,
|
2089 |
+
"learning_rate": 3.068419385492556e-05,
|
2090 |
+
"loss": 0.0275,
|
2091 |
+
"step": 175064
|
2092 |
+
},
|
2093 |
+
{
|
2094 |
+
"epoch": 13.913208742477035,
|
2095 |
+
"grad_norm": 0.6263837814331055,
|
2096 |
+
"learning_rate": 3.0433956287614825e-05,
|
2097 |
+
"loss": 0.03,
|
2098 |
+
"step": 175696
|
2099 |
+
},
|
2100 |
+
{
|
2101 |
+
"epoch": 13.963256255939182,
|
2102 |
+
"grad_norm": 2.3187966346740723,
|
2103 |
+
"learning_rate": 3.018371872030409e-05,
|
2104 |
+
"loss": 0.0277,
|
2105 |
+
"step": 176328
|
2106 |
+
},
|
2107 |
+
{
|
2108 |
+
"epoch": 14.0,
|
2109 |
+
"eval_gen_len": 12.35064447,
|
2110 |
+
"eval_loss": 0.031636711210012436,
|
2111 |
+
"eval_runtime": 514.9724,
|
2112 |
+
"eval_samples_per_second": 103.349,
|
2113 |
+
"eval_steps_per_second": 1.616,
|
2114 |
+
"eval_wordacc": 0.94785991,
|
2115 |
+
"eval_wordacc_oov": 0.9040336,
|
2116 |
+
"step": 176792
|
2117 |
+
},
|
2118 |
+
{
|
2119 |
+
"epoch": 14.01330376940133,
|
2120 |
+
"grad_norm": 0.09013538807630539,
|
2121 |
+
"learning_rate": 2.993348115299335e-05,
|
2122 |
+
"loss": 0.0291,
|
2123 |
+
"step": 176960
|
2124 |
+
},
|
2125 |
+
{
|
2126 |
+
"epoch": 14.063351282863477,
|
2127 |
+
"grad_norm": 0.7201142907142639,
|
2128 |
+
"learning_rate": 2.9683243585682614e-05,
|
2129 |
+
"loss": 0.0216,
|
2130 |
+
"step": 177592
|
2131 |
+
},
|
2132 |
+
{
|
2133 |
+
"epoch": 14.113398796325626,
|
2134 |
+
"grad_norm": 0.2142077535390854,
|
2135 |
+
"learning_rate": 2.943300601837187e-05,
|
2136 |
+
"loss": 0.0264,
|
2137 |
+
"step": 178224
|
2138 |
+
},
|
2139 |
+
{
|
2140 |
+
"epoch": 14.163446309787773,
|
2141 |
+
"grad_norm": 0.618773341178894,
|
2142 |
+
"learning_rate": 2.918276845106114e-05,
|
2143 |
+
"loss": 0.0248,
|
2144 |
+
"step": 178856
|
2145 |
+
},
|
2146 |
+
{
|
2147 |
+
"epoch": 14.213493823249921,
|
2148 |
+
"grad_norm": 1.0433090925216675,
|
2149 |
+
"learning_rate": 2.8932530883750396e-05,
|
2150 |
+
"loss": 0.0286,
|
2151 |
+
"step": 179488
|
2152 |
+
},
|
2153 |
+
{
|
2154 |
+
"epoch": 14.263541336712068,
|
2155 |
+
"grad_norm": 0.11434757709503174,
|
2156 |
+
"learning_rate": 2.8682293316439656e-05,
|
2157 |
+
"loss": 0.0253,
|
2158 |
+
"step": 180120
|
2159 |
+
},
|
2160 |
+
{
|
2161 |
+
"epoch": 14.313588850174217,
|
2162 |
+
"grad_norm": 1.1828645467758179,
|
2163 |
+
"learning_rate": 2.843205574912892e-05,
|
2164 |
+
"loss": 0.0245,
|
2165 |
+
"step": 180752
|
2166 |
+
},
|
2167 |
+
{
|
2168 |
+
"epoch": 14.363636363636363,
|
2169 |
+
"grad_norm": 0.3703102171421051,
|
2170 |
+
"learning_rate": 2.818181818181818e-05,
|
2171 |
+
"loss": 0.0261,
|
2172 |
+
"step": 181384
|
2173 |
+
},
|
2174 |
+
{
|
2175 |
+
"epoch": 14.413683877098512,
|
2176 |
+
"grad_norm": 0.20109039545059204,
|
2177 |
+
"learning_rate": 2.7931580614507445e-05,
|
2178 |
+
"loss": 0.0255,
|
2179 |
+
"step": 182016
|
2180 |
+
},
|
2181 |
+
{
|
2182 |
+
"epoch": 14.463731390560659,
|
2183 |
+
"grad_norm": 0.627756655216217,
|
2184 |
+
"learning_rate": 2.7681343047196706e-05,
|
2185 |
+
"loss": 0.0228,
|
2186 |
+
"step": 182648
|
2187 |
+
},
|
2188 |
+
{
|
2189 |
+
"epoch": 14.513778904022807,
|
2190 |
+
"grad_norm": 0.04185875132679939,
|
2191 |
+
"learning_rate": 2.743110547988597e-05,
|
2192 |
+
"loss": 0.0237,
|
2193 |
+
"step": 183280
|
2194 |
+
},
|
2195 |
+
{
|
2196 |
+
"epoch": 14.563826417484954,
|
2197 |
+
"grad_norm": 0.10042019188404083,
|
2198 |
+
"learning_rate": 2.718086791257523e-05,
|
2199 |
+
"loss": 0.0272,
|
2200 |
+
"step": 183912
|
2201 |
+
},
|
2202 |
+
{
|
2203 |
+
"epoch": 14.613873930947102,
|
2204 |
+
"grad_norm": 1.0942548513412476,
|
2205 |
+
"learning_rate": 2.693063034526449e-05,
|
2206 |
+
"loss": 0.0242,
|
2207 |
+
"step": 184544
|
2208 |
+
},
|
2209 |
+
{
|
2210 |
+
"epoch": 14.66392144440925,
|
2211 |
+
"grad_norm": 0.7228376269340515,
|
2212 |
+
"learning_rate": 2.6680392777953756e-05,
|
2213 |
+
"loss": 0.0252,
|
2214 |
+
"step": 185176
|
2215 |
+
},
|
2216 |
+
{
|
2217 |
+
"epoch": 14.713968957871398,
|
2218 |
+
"grad_norm": 0.6470568776130676,
|
2219 |
+
"learning_rate": 2.6430155210643016e-05,
|
2220 |
+
"loss": 0.0269,
|
2221 |
+
"step": 185808
|
2222 |
+
},
|
2223 |
+
{
|
2224 |
+
"epoch": 14.764016471333544,
|
2225 |
+
"grad_norm": 0.4635055363178253,
|
2226 |
+
"learning_rate": 2.617991764333228e-05,
|
2227 |
+
"loss": 0.0243,
|
2228 |
+
"step": 186440
|
2229 |
+
},
|
2230 |
+
{
|
2231 |
+
"epoch": 14.814063984795691,
|
2232 |
+
"grad_norm": 0.43463265895843506,
|
2233 |
+
"learning_rate": 2.592968007602154e-05,
|
2234 |
+
"loss": 0.0265,
|
2235 |
+
"step": 187072
|
2236 |
+
},
|
2237 |
+
{
|
2238 |
+
"epoch": 14.86411149825784,
|
2239 |
+
"grad_norm": 0.7284203767776489,
|
2240 |
+
"learning_rate": 2.5679442508710805e-05,
|
2241 |
+
"loss": 0.0269,
|
2242 |
+
"step": 187704
|
2243 |
+
},
|
2244 |
+
{
|
2245 |
+
"epoch": 14.914159011719988,
|
2246 |
+
"grad_norm": 1.462714433670044,
|
2247 |
+
"learning_rate": 2.5429204941400066e-05,
|
2248 |
+
"loss": 0.0236,
|
2249 |
+
"step": 188336
|
2250 |
+
},
|
2251 |
+
{
|
2252 |
+
"epoch": 14.964206525182135,
|
2253 |
+
"grad_norm": 0.029578888788819313,
|
2254 |
+
"learning_rate": 2.5178967374089323e-05,
|
2255 |
+
"loss": 0.0277,
|
2256 |
+
"step": 188968
|
2257 |
+
},
|
2258 |
+
{
|
2259 |
+
"epoch": 15.0,
|
2260 |
+
"eval_gen_len": 12.35139604,
|
2261 |
+
"eval_loss": 0.032311730086803436,
|
2262 |
+
"eval_runtime": 516.3173,
|
2263 |
+
"eval_samples_per_second": 103.08,
|
2264 |
+
"eval_steps_per_second": 1.611,
|
2265 |
+
"eval_wordacc": 0.94876179,
|
2266 |
+
"eval_wordacc_oov": 0.90299668,
|
2267 |
+
"step": 189420
|
2268 |
+
},
|
2269 |
+
{
|
2270 |
+
"epoch": 15.014254038644282,
|
2271 |
+
"grad_norm": 0.3032616376876831,
|
2272 |
+
"learning_rate": 2.4928729806778587e-05,
|
2273 |
+
"loss": 0.0275,
|
2274 |
+
"step": 189600
|
2275 |
+
},
|
2276 |
+
{
|
2277 |
+
"epoch": 15.06430155210643,
|
2278 |
+
"grad_norm": 0.08022745698690414,
|
2279 |
+
"learning_rate": 2.467849223946785e-05,
|
2280 |
+
"loss": 0.0212,
|
2281 |
+
"step": 190232
|
2282 |
+
},
|
2283 |
+
{
|
2284 |
+
"epoch": 15.114349065568577,
|
2285 |
+
"grad_norm": 0.06174493208527565,
|
2286 |
+
"learning_rate": 2.4428254672157112e-05,
|
2287 |
+
"loss": 0.0233,
|
2288 |
+
"step": 190864
|
2289 |
+
},
|
2290 |
+
{
|
2291 |
+
"epoch": 15.164396579030726,
|
2292 |
+
"grad_norm": 0.7364438772201538,
|
2293 |
+
"learning_rate": 2.4178017104846373e-05,
|
2294 |
+
"loss": 0.0214,
|
2295 |
+
"step": 191496
|
2296 |
+
},
|
2297 |
+
{
|
2298 |
+
"epoch": 15.214444092492872,
|
2299 |
+
"grad_norm": 0.022581493481993675,
|
2300 |
+
"learning_rate": 2.3927779537535637e-05,
|
2301 |
+
"loss": 0.0236,
|
2302 |
+
"step": 192128
|
2303 |
+
},
|
2304 |
+
{
|
2305 |
+
"epoch": 15.26449160595502,
|
2306 |
+
"grad_norm": 0.03313857316970825,
|
2307 |
+
"learning_rate": 2.3677541970224898e-05,
|
2308 |
+
"loss": 0.0246,
|
2309 |
+
"step": 192760
|
2310 |
+
},
|
2311 |
+
{
|
2312 |
+
"epoch": 15.314539119417168,
|
2313 |
+
"grad_norm": 0.7156698703765869,
|
2314 |
+
"learning_rate": 2.3427304402914162e-05,
|
2315 |
+
"loss": 0.0263,
|
2316 |
+
"step": 193392
|
2317 |
+
},
|
2318 |
+
{
|
2319 |
+
"epoch": 15.364586632879316,
|
2320 |
+
"grad_norm": 0.626669704914093,
|
2321 |
+
"learning_rate": 2.3177066835603423e-05,
|
2322 |
+
"loss": 0.0219,
|
2323 |
+
"step": 194024
|
2324 |
+
},
|
2325 |
+
{
|
2326 |
+
"epoch": 15.414634146341463,
|
2327 |
+
"grad_norm": 1.3695429563522339,
|
2328 |
+
"learning_rate": 2.2926829268292687e-05,
|
2329 |
+
"loss": 0.0244,
|
2330 |
+
"step": 194656
|
2331 |
+
},
|
2332 |
+
{
|
2333 |
+
"epoch": 15.464681659803611,
|
2334 |
+
"grad_norm": 0.03239896893501282,
|
2335 |
+
"learning_rate": 2.2676591700981947e-05,
|
2336 |
+
"loss": 0.0217,
|
2337 |
+
"step": 195288
|
2338 |
+
},
|
2339 |
+
{
|
2340 |
+
"epoch": 15.514729173265758,
|
2341 |
+
"grad_norm": 0.4434269666671753,
|
2342 |
+
"learning_rate": 2.2426354133671208e-05,
|
2343 |
+
"loss": 0.026,
|
2344 |
+
"step": 195920
|
2345 |
+
},
|
2346 |
+
{
|
2347 |
+
"epoch": 15.564776686727907,
|
2348 |
+
"grad_norm": 0.11541499197483063,
|
2349 |
+
"learning_rate": 2.217611656636047e-05,
|
2350 |
+
"loss": 0.0251,
|
2351 |
+
"step": 196552
|
2352 |
+
},
|
2353 |
+
{
|
2354 |
+
"epoch": 15.614824200190053,
|
2355 |
+
"grad_norm": 0.11521150171756744,
|
2356 |
+
"learning_rate": 2.1925878999049733e-05,
|
2357 |
+
"loss": 0.0234,
|
2358 |
+
"step": 197184
|
2359 |
+
},
|
2360 |
+
{
|
2361 |
+
"epoch": 15.664871713652202,
|
2362 |
+
"grad_norm": 0.10318135470151901,
|
2363 |
+
"learning_rate": 2.1675641431738994e-05,
|
2364 |
+
"loss": 0.0239,
|
2365 |
+
"step": 197816
|
2366 |
+
},
|
2367 |
+
{
|
2368 |
+
"epoch": 15.714919227114349,
|
2369 |
+
"grad_norm": 0.181732639670372,
|
2370 |
+
"learning_rate": 2.1425403864428258e-05,
|
2371 |
+
"loss": 0.0231,
|
2372 |
+
"step": 198448
|
2373 |
+
},
|
2374 |
+
{
|
2375 |
+
"epoch": 15.764966740576497,
|
2376 |
+
"grad_norm": 0.43891534209251404,
|
2377 |
+
"learning_rate": 2.117516629711752e-05,
|
2378 |
+
"loss": 0.0258,
|
2379 |
+
"step": 199080
|
2380 |
+
},
|
2381 |
+
{
|
2382 |
+
"epoch": 15.815014254038644,
|
2383 |
+
"grad_norm": 0.2669106125831604,
|
2384 |
+
"learning_rate": 2.0924928729806782e-05,
|
2385 |
+
"loss": 0.0258,
|
2386 |
+
"step": 199712
|
2387 |
+
},
|
2388 |
+
{
|
2389 |
+
"epoch": 15.865061767500793,
|
2390 |
+
"grad_norm": 0.787148118019104,
|
2391 |
+
"learning_rate": 2.067469116249604e-05,
|
2392 |
+
"loss": 0.023,
|
2393 |
+
"step": 200344
|
2394 |
+
},
|
2395 |
+
{
|
2396 |
+
"epoch": 15.91510928096294,
|
2397 |
+
"grad_norm": 0.09489845484495163,
|
2398 |
+
"learning_rate": 2.0424453595185304e-05,
|
2399 |
+
"loss": 0.0233,
|
2400 |
+
"step": 200976
|
2401 |
+
},
|
2402 |
+
{
|
2403 |
+
"epoch": 15.965156794425088,
|
2404 |
+
"grad_norm": 0.07817026227712631,
|
2405 |
+
"learning_rate": 2.0174216027874565e-05,
|
2406 |
+
"loss": 0.0245,
|
2407 |
+
"step": 201608
|
2408 |
+
},
|
2409 |
+
{
|
2410 |
+
"epoch": 16.0,
|
2411 |
+
"eval_gen_len": 12.35013716,
|
2412 |
+
"eval_loss": 0.03139541670680046,
|
2413 |
+
"eval_runtime": 516.248,
|
2414 |
+
"eval_samples_per_second": 103.094,
|
2415 |
+
"eval_steps_per_second": 1.612,
|
2416 |
+
"eval_wordacc": 0.95127955,
|
2417 |
+
"eval_wordacc_oov": 0.90724803,
|
2418 |
+
"step": 202048
|
2419 |
+
},
|
2420 |
+
{
|
2421 |
+
"epoch": 16.015204307887235,
|
2422 |
+
"grad_norm": 0.026963738724589348,
|
2423 |
+
"learning_rate": 1.992397846056383e-05,
|
2424 |
+
"loss": 0.0209,
|
2425 |
+
"step": 202240
|
2426 |
+
},
|
2427 |
+
{
|
2428 |
+
"epoch": 16.06525182134938,
|
2429 |
+
"grad_norm": 0.031288400292396545,
|
2430 |
+
"learning_rate": 1.967374089325309e-05,
|
2431 |
+
"loss": 0.0211,
|
2432 |
+
"step": 202872
|
2433 |
+
},
|
2434 |
+
{
|
2435 |
+
"epoch": 16.11529933481153,
|
2436 |
+
"grad_norm": 0.04831352084875107,
|
2437 |
+
"learning_rate": 1.9423503325942354e-05,
|
2438 |
+
"loss": 0.0216,
|
2439 |
+
"step": 203504
|
2440 |
+
},
|
2441 |
+
{
|
2442 |
+
"epoch": 16.16534684827368,
|
2443 |
+
"grad_norm": 0.3481796383857727,
|
2444 |
+
"learning_rate": 1.9173265758631614e-05,
|
2445 |
+
"loss": 0.0215,
|
2446 |
+
"step": 204136
|
2447 |
+
},
|
2448 |
+
{
|
2449 |
+
"epoch": 16.215394361735825,
|
2450 |
+
"grad_norm": 0.21324285864830017,
|
2451 |
+
"learning_rate": 1.8923028191320875e-05,
|
2452 |
+
"loss": 0.0218,
|
2453 |
+
"step": 204768
|
2454 |
+
},
|
2455 |
+
{
|
2456 |
+
"epoch": 16.265441875197972,
|
2457 |
+
"grad_norm": 1.6428892612457275,
|
2458 |
+
"learning_rate": 1.8672790624010136e-05,
|
2459 |
+
"loss": 0.021,
|
2460 |
+
"step": 205400
|
2461 |
+
},
|
2462 |
+
{
|
2463 |
+
"epoch": 16.31548938866012,
|
2464 |
+
"grad_norm": 0.2430795133113861,
|
2465 |
+
"learning_rate": 1.84225530566994e-05,
|
2466 |
+
"loss": 0.02,
|
2467 |
+
"step": 206032
|
2468 |
+
},
|
2469 |
+
{
|
2470 |
+
"epoch": 16.36553690212227,
|
2471 |
+
"grad_norm": 0.5367847681045532,
|
2472 |
+
"learning_rate": 1.817231548938866e-05,
|
2473 |
+
"loss": 0.0209,
|
2474 |
+
"step": 206664
|
2475 |
+
},
|
2476 |
+
{
|
2477 |
+
"epoch": 16.415584415584416,
|
2478 |
+
"grad_norm": 0.08580495417118073,
|
2479 |
+
"learning_rate": 1.7922077922077925e-05,
|
2480 |
+
"loss": 0.0204,
|
2481 |
+
"step": 207296
|
2482 |
+
},
|
2483 |
+
{
|
2484 |
+
"epoch": 16.465631929046562,
|
2485 |
+
"grad_norm": 0.6704065203666687,
|
2486 |
+
"learning_rate": 1.7671840354767185e-05,
|
2487 |
+
"loss": 0.0218,
|
2488 |
+
"step": 207928
|
2489 |
+
},
|
2490 |
+
{
|
2491 |
+
"epoch": 16.51567944250871,
|
2492 |
+
"grad_norm": 1.014145016670227,
|
2493 |
+
"learning_rate": 1.742160278745645e-05,
|
2494 |
+
"loss": 0.0216,
|
2495 |
+
"step": 208560
|
2496 |
+
},
|
2497 |
+
{
|
2498 |
+
"epoch": 16.56572695597086,
|
2499 |
+
"grad_norm": 0.040132321417331696,
|
2500 |
+
"learning_rate": 1.7171365220145707e-05,
|
2501 |
+
"loss": 0.0249,
|
2502 |
+
"step": 209192
|
2503 |
+
},
|
2504 |
+
{
|
2505 |
+
"epoch": 16.615774469433006,
|
2506 |
+
"grad_norm": 1.3288291692733765,
|
2507 |
+
"learning_rate": 1.692112765283497e-05,
|
2508 |
+
"loss": 0.0216,
|
2509 |
+
"step": 209824
|
2510 |
+
},
|
2511 |
+
{
|
2512 |
+
"epoch": 16.665821982895153,
|
2513 |
+
"grad_norm": 0.6535305976867676,
|
2514 |
+
"learning_rate": 1.667089008552423e-05,
|
2515 |
+
"loss": 0.0221,
|
2516 |
+
"step": 210456
|
2517 |
+
},
|
2518 |
+
{
|
2519 |
+
"epoch": 16.7158694963573,
|
2520 |
+
"grad_norm": 0.516942024230957,
|
2521 |
+
"learning_rate": 1.6420652518213496e-05,
|
2522 |
+
"loss": 0.0209,
|
2523 |
+
"step": 211088
|
2524 |
+
},
|
2525 |
+
{
|
2526 |
+
"epoch": 16.76591700981945,
|
2527 |
+
"grad_norm": 0.6040933728218079,
|
2528 |
+
"learning_rate": 1.6170414950902756e-05,
|
2529 |
+
"loss": 0.0231,
|
2530 |
+
"step": 211720
|
2531 |
+
},
|
2532 |
+
{
|
2533 |
+
"epoch": 16.815964523281597,
|
2534 |
+
"grad_norm": 0.07999061793088913,
|
2535 |
+
"learning_rate": 1.592017738359202e-05,
|
2536 |
+
"loss": 0.023,
|
2537 |
+
"step": 212352
|
2538 |
+
},
|
2539 |
+
{
|
2540 |
+
"epoch": 16.866012036743744,
|
2541 |
+
"grad_norm": 2.897904872894287,
|
2542 |
+
"learning_rate": 1.566993981628128e-05,
|
2543 |
+
"loss": 0.0227,
|
2544 |
+
"step": 212984
|
2545 |
+
},
|
2546 |
+
{
|
2547 |
+
"epoch": 16.91605955020589,
|
2548 |
+
"grad_norm": 0.2583966553211212,
|
2549 |
+
"learning_rate": 1.5419702248970542e-05,
|
2550 |
+
"loss": 0.0218,
|
2551 |
+
"step": 213616
|
2552 |
+
},
|
2553 |
+
{
|
2554 |
+
"epoch": 16.96610706366804,
|
2555 |
+
"grad_norm": 1.3506840467453003,
|
2556 |
+
"learning_rate": 1.5169464681659804e-05,
|
2557 |
+
"loss": 0.0235,
|
2558 |
+
"step": 214248
|
2559 |
+
},
|
2560 |
+
{
|
2561 |
+
"epoch": 17.0,
|
2562 |
+
"eval_gen_len": 12.35109541,
|
2563 |
+
"eval_loss": 0.03128550201654434,
|
2564 |
+
"eval_runtime": 515.496,
|
2565 |
+
"eval_samples_per_second": 103.244,
|
2566 |
+
"eval_steps_per_second": 1.614,
|
2567 |
+
"eval_wordacc": 0.95197475,
|
2568 |
+
"eval_wordacc_oov": 0.90714434,
|
2569 |
+
"step": 214676
|
2570 |
+
},
|
2571 |
+
{
|
2572 |
+
"epoch": 17.016154577130187,
|
2573 |
+
"grad_norm": 0.28379184007644653,
|
2574 |
+
"learning_rate": 1.4919227114349067e-05,
|
2575 |
+
"loss": 0.0196,
|
2576 |
+
"step": 214880
|
2577 |
+
},
|
2578 |
+
{
|
2579 |
+
"epoch": 17.066202090592334,
|
2580 |
+
"grad_norm": 0.2609073221683502,
|
2581 |
+
"learning_rate": 1.4668989547038327e-05,
|
2582 |
+
"loss": 0.0199,
|
2583 |
+
"step": 215512
|
2584 |
+
},
|
2585 |
+
{
|
2586 |
+
"epoch": 17.11624960405448,
|
2587 |
+
"grad_norm": 0.22402538359165192,
|
2588 |
+
"learning_rate": 1.441875197972759e-05,
|
2589 |
+
"loss": 0.0203,
|
2590 |
+
"step": 216144
|
2591 |
+
},
|
2592 |
+
{
|
2593 |
+
"epoch": 17.16629711751663,
|
2594 |
+
"grad_norm": 0.020612630993127823,
|
2595 |
+
"learning_rate": 1.4168514412416852e-05,
|
2596 |
+
"loss": 0.0196,
|
2597 |
+
"step": 216776
|
2598 |
+
},
|
2599 |
+
{
|
2600 |
+
"epoch": 17.216344630978778,
|
2601 |
+
"grad_norm": 0.036708053201436996,
|
2602 |
+
"learning_rate": 1.3918276845106115e-05,
|
2603 |
+
"loss": 0.0164,
|
2604 |
+
"step": 217408
|
2605 |
+
},
|
2606 |
+
{
|
2607 |
+
"epoch": 17.266392144440925,
|
2608 |
+
"grad_norm": 1.0852469205856323,
|
2609 |
+
"learning_rate": 1.3668039277795375e-05,
|
2610 |
+
"loss": 0.0197,
|
2611 |
+
"step": 218040
|
2612 |
+
},
|
2613 |
+
{
|
2614 |
+
"epoch": 17.31643965790307,
|
2615 |
+
"grad_norm": 0.08419329673051834,
|
2616 |
+
"learning_rate": 1.3417801710484638e-05,
|
2617 |
+
"loss": 0.0223,
|
2618 |
+
"step": 218672
|
2619 |
+
},
|
2620 |
+
{
|
2621 |
+
"epoch": 17.366487171365222,
|
2622 |
+
"grad_norm": 0.8586103320121765,
|
2623 |
+
"learning_rate": 1.31675641431739e-05,
|
2624 |
+
"loss": 0.0217,
|
2625 |
+
"step": 219304
|
2626 |
+
},
|
2627 |
+
{
|
2628 |
+
"epoch": 17.41653468482737,
|
2629 |
+
"grad_norm": 0.41822636127471924,
|
2630 |
+
"learning_rate": 1.2917326575863162e-05,
|
2631 |
+
"loss": 0.0203,
|
2632 |
+
"step": 219936
|
2633 |
+
},
|
2634 |
+
{
|
2635 |
+
"epoch": 17.466582198289515,
|
2636 |
+
"grad_norm": 0.16271276772022247,
|
2637 |
+
"learning_rate": 1.2667089008552425e-05,
|
2638 |
+
"loss": 0.0211,
|
2639 |
+
"step": 220568
|
2640 |
+
},
|
2641 |
+
{
|
2642 |
+
"epoch": 17.516629711751662,
|
2643 |
+
"grad_norm": 0.7023501992225647,
|
2644 |
+
"learning_rate": 1.2416851441241686e-05,
|
2645 |
+
"loss": 0.0197,
|
2646 |
+
"step": 221200
|
2647 |
+
},
|
2648 |
+
{
|
2649 |
+
"epoch": 17.56667722521381,
|
2650 |
+
"grad_norm": 0.038701217621564865,
|
2651 |
+
"learning_rate": 1.2166613873930948e-05,
|
2652 |
+
"loss": 0.022,
|
2653 |
+
"step": 221832
|
2654 |
+
},
|
2655 |
+
{
|
2656 |
+
"epoch": 17.61672473867596,
|
2657 |
+
"grad_norm": 0.03481602668762207,
|
2658 |
+
"learning_rate": 1.191637630662021e-05,
|
2659 |
+
"loss": 0.0178,
|
2660 |
+
"step": 222464
|
2661 |
+
},
|
2662 |
+
{
|
2663 |
+
"epoch": 17.666772252138106,
|
2664 |
+
"grad_norm": 0.3818272352218628,
|
2665 |
+
"learning_rate": 1.1666138739309473e-05,
|
2666 |
+
"loss": 0.0204,
|
2667 |
+
"step": 223096
|
2668 |
+
},
|
2669 |
+
{
|
2670 |
+
"epoch": 17.716819765600253,
|
2671 |
+
"grad_norm": 1.1725687980651855,
|
2672 |
+
"learning_rate": 1.1415901171998734e-05,
|
2673 |
+
"loss": 0.0214,
|
2674 |
+
"step": 223728
|
2675 |
+
},
|
2676 |
+
{
|
2677 |
+
"epoch": 17.7668672790624,
|
2678 |
+
"grad_norm": 0.5618239641189575,
|
2679 |
+
"learning_rate": 1.1165663604687996e-05,
|
2680 |
+
"loss": 0.0198,
|
2681 |
+
"step": 224360
|
2682 |
+
},
|
2683 |
+
{
|
2684 |
+
"epoch": 17.81691479252455,
|
2685 |
+
"grad_norm": 0.11917304992675781,
|
2686 |
+
"learning_rate": 1.0915426037377258e-05,
|
2687 |
+
"loss": 0.0203,
|
2688 |
+
"step": 224992
|
2689 |
+
},
|
2690 |
+
{
|
2691 |
+
"epoch": 17.866962305986696,
|
2692 |
+
"grad_norm": 0.1154344379901886,
|
2693 |
+
"learning_rate": 1.066518847006652e-05,
|
2694 |
+
"loss": 0.0217,
|
2695 |
+
"step": 225624
|
2696 |
+
},
|
2697 |
+
{
|
2698 |
+
"epoch": 17.917009819448843,
|
2699 |
+
"grad_norm": 0.12502028048038483,
|
2700 |
+
"learning_rate": 1.0414950902755781e-05,
|
2701 |
+
"loss": 0.0217,
|
2702 |
+
"step": 226256
|
2703 |
+
},
|
2704 |
+
{
|
2705 |
+
"epoch": 17.96705733291099,
|
2706 |
+
"grad_norm": 0.27321675419807434,
|
2707 |
+
"learning_rate": 1.0164713335445044e-05,
|
2708 |
+
"loss": 0.0206,
|
2709 |
+
"step": 226888
|
2710 |
+
},
|
2711 |
+
{
|
2712 |
+
"epoch": 18.0,
|
2713 |
+
"eval_gen_len": 12.35015595,
|
2714 |
+
"eval_loss": 0.030980365350842476,
|
2715 |
+
"eval_runtime": 547.6731,
|
2716 |
+
"eval_samples_per_second": 97.178,
|
2717 |
+
"eval_steps_per_second": 1.519,
|
2718 |
+
"eval_wordacc": 0.95308331,
|
2719 |
+
"eval_wordacc_oov": 0.90844048,
|
2720 |
+
"step": 227304
|
2721 |
+
},
    {
      "epoch": 18.01710484637314,
      "grad_norm": 0.15305259823799133,
      "learning_rate": 9.914475768134306e-06,
      "loss": 0.018,
      "step": 227520
    },
    {
      "epoch": 18.067152359835287,
      "grad_norm": 2.5299997329711914,
      "learning_rate": 9.664238200823567e-06,
      "loss": 0.0183,
      "step": 228152
    },
    {
      "epoch": 18.117199873297434,
      "grad_norm": 0.2780356705188751,
      "learning_rate": 9.41400063351283e-06,
      "loss": 0.0179,
      "step": 228784
    },
    {
      "epoch": 18.16724738675958,
      "grad_norm": 0.06621097773313522,
      "learning_rate": 9.163763066202092e-06,
      "loss": 0.0196,
      "step": 229416
    },
    {
      "epoch": 18.21729490022173,
      "grad_norm": 0.05437783896923065,
      "learning_rate": 8.913525498891354e-06,
      "loss": 0.0186,
      "step": 230048
    },
    {
      "epoch": 18.267342413683878,
      "grad_norm": 0.62905353307724,
      "learning_rate": 8.663287931580615e-06,
      "loss": 0.0202,
      "step": 230680
    },
    {
      "epoch": 18.317389927146024,
      "grad_norm": 0.03628231957554817,
      "learning_rate": 8.413050364269877e-06,
      "loss": 0.0186,
      "step": 231312
    },
    {
      "epoch": 18.36743744060817,
      "grad_norm": 0.1532977819442749,
      "learning_rate": 8.16281279695914e-06,
      "loss": 0.0197,
      "step": 231944
    },
    {
      "epoch": 18.41748495407032,
      "grad_norm": 0.15254627168178558,
      "learning_rate": 7.9125752296484e-06,
      "loss": 0.0177,
      "step": 232576
    },
    {
      "epoch": 18.467532467532468,
      "grad_norm": 1.0562399625778198,
      "learning_rate": 7.662337662337663e-06,
      "loss": 0.0172,
      "step": 233208
    },
    {
      "epoch": 18.517579980994615,
      "grad_norm": 0.4453679323196411,
      "learning_rate": 7.412100095026925e-06,
      "loss": 0.0197,
      "step": 233840
    },
    {
      "epoch": 18.56762749445676,
      "grad_norm": 0.26340222358703613,
      "learning_rate": 7.161862527716187e-06,
      "loss": 0.018,
      "step": 234472
    },
    {
      "epoch": 18.617675007918912,
      "grad_norm": 0.052670177072286606,
      "learning_rate": 6.911624960405448e-06,
      "loss": 0.0184,
      "step": 235104
    },
    {
      "epoch": 18.66772252138106,
      "grad_norm": 0.026885686442255974,
      "learning_rate": 6.661387393094711e-06,
      "loss": 0.0183,
      "step": 235736
    },
    {
      "epoch": 18.717770034843205,
      "grad_norm": 0.34494689106941223,
      "learning_rate": 6.411149825783973e-06,
      "loss": 0.0192,
      "step": 236368
    },
    {
      "epoch": 18.767817548305352,
      "grad_norm": 0.0610111765563488,
      "learning_rate": 6.160912258473235e-06,
      "loss": 0.0179,
      "step": 237000
    },
    {
      "epoch": 18.817865061767503,
      "grad_norm": 0.02205130271613598,
      "learning_rate": 5.910674691162496e-06,
      "loss": 0.0218,
      "step": 237632
    },
    {
      "epoch": 18.86791257522965,
      "grad_norm": 0.046364884823560715,
      "learning_rate": 5.660437123851759e-06,
      "loss": 0.0167,
      "step": 238264
    },
    {
      "epoch": 18.917960088691796,
      "grad_norm": 0.8253034949302673,
      "learning_rate": 5.41019955654102e-06,
      "loss": 0.0206,
      "step": 238896
    },
    {
      "epoch": 18.968007602153943,
      "grad_norm": 0.07394740730524063,
      "learning_rate": 5.159961989230283e-06,
      "loss": 0.0178,
      "step": 239528
    },
    {
      "epoch": 19.0,
      "eval_gen_len": 12.35068205,
      "eval_loss": 0.03073255531489849,
      "eval_runtime": 534.9855,
      "eval_samples_per_second": 99.483,
      "eval_steps_per_second": 1.555,
      "eval_wordacc": 0.95447371,
      "eval_wordacc_oov": 0.9093737,
      "step": 239932
    },
    {
      "epoch": 19.01805511561609,
      "grad_norm": 0.05381210148334503,
      "learning_rate": 4.909724421919544e-06,
      "loss": 0.017,
      "step": 240160
    },
    {
      "epoch": 19.06810262907824,
      "grad_norm": 0.05161185935139656,
      "learning_rate": 4.659486854608807e-06,
      "loss": 0.0184,
      "step": 240792
    },
    {
      "epoch": 19.118150142540387,
      "grad_norm": 0.054974284023046494,
      "learning_rate": 4.409249287298068e-06,
      "loss": 0.0165,
      "step": 241424
    },
    {
      "epoch": 19.168197656002533,
      "grad_norm": 0.4919557571411133,
      "learning_rate": 4.15901171998733e-06,
      "loss": 0.0171,
      "step": 242056
    },
    {
      "epoch": 19.21824516946468,
      "grad_norm": 0.8104053735733032,
      "learning_rate": 3.908774152676592e-06,
      "loss": 0.0161,
      "step": 242688
    },
    {
      "epoch": 19.26829268292683,
      "grad_norm": 0.04452453926205635,
      "learning_rate": 3.6585365853658537e-06,
      "loss": 0.017,
      "step": 243320
    },
    {
      "epoch": 19.318340196388977,
      "grad_norm": 2.039726972579956,
      "learning_rate": 3.408299018055116e-06,
      "loss": 0.0188,
      "step": 243952
    },
    {
      "epoch": 19.368387709851124,
      "grad_norm": 0.05839849263429642,
      "learning_rate": 3.1580614507443777e-06,
      "loss": 0.0174,
      "step": 244584
    },
    {
      "epoch": 19.41843522331327,
      "grad_norm": 0.06209117919206619,
      "learning_rate": 2.9078238834336396e-06,
      "loss": 0.0177,
      "step": 245216
    },
    {
      "epoch": 19.46848273677542,
      "grad_norm": 0.023634545505046844,
      "learning_rate": 2.6575863161229016e-06,
      "loss": 0.0187,
      "step": 245848
    },
    {
      "epoch": 19.518530250237568,
      "grad_norm": 0.4597359001636505,
      "learning_rate": 2.4073487488121636e-06,
      "loss": 0.0157,
      "step": 246480
    },
    {
      "epoch": 19.568577763699714,
      "grad_norm": 0.03822094574570656,
      "learning_rate": 2.1571111815014256e-06,
      "loss": 0.0191,
      "step": 247112
    },
    {
      "epoch": 19.61862527716186,
      "grad_norm": 0.028475474566221237,
      "learning_rate": 1.9068736141906876e-06,
      "loss": 0.0184,
      "step": 247744
    },
    {
      "epoch": 19.66867279062401,
      "grad_norm": 2.0368409156799316,
      "learning_rate": 1.6566360468799495e-06,
      "loss": 0.016,
      "step": 248376
    },
    {
      "epoch": 19.71872030408616,
      "grad_norm": 0.6395711898803711,
      "learning_rate": 1.4063984795692113e-06,
      "loss": 0.0182,
      "step": 249008
    },
    {
      "epoch": 19.768767817548305,
      "grad_norm": 0.11478591710329056,
      "learning_rate": 1.1561609122584733e-06,
      "loss": 0.0172,
      "step": 249640
    },
    {
      "epoch": 19.818815331010452,
      "grad_norm": 2.0897409915924072,
      "learning_rate": 9.059233449477353e-07,
      "loss": 0.0178,
      "step": 250272
    },
    {
      "epoch": 19.868862844472602,
      "grad_norm": 1.6272854804992676,
      "learning_rate": 6.556857776369973e-07,
      "loss": 0.0165,
      "step": 250904
    },
    {
      "epoch": 19.91891035793475,
      "grad_norm": 0.018388045951724052,
      "learning_rate": 4.0544821032625913e-07,
      "loss": 0.0153,
      "step": 251536
    },
    {
      "epoch": 19.968957871396896,
      "grad_norm": 0.097270667552948,
      "learning_rate": 1.5521064301552106e-07,
      "loss": 0.016,
      "step": 252168
    },
    {
      "epoch": 20.0,
      "eval_gen_len": 12.35156514,
      "eval_loss": 0.030767865478992462,
      "eval_runtime": 535.8673,
      "eval_samples_per_second": 99.319,
      "eval_steps_per_second": 1.553,
      "eval_wordacc": 0.95458645,
      "eval_wordacc_oov": 0.90963293,
      "step": 252560
    },
    {
      "epoch": 20.0,
      "step": 252560,
      "total_flos": 1486519029399552.0,
      "train_loss": 0.04684994914012881,
      "train_runtime": 19459.2196,
      "train_samples_per_second": 103.832,
      "train_steps_per_second": 12.979
    }
  ],
  "logging_steps": 632,
  "max_steps": 252560,
  "num_input_tokens_seen": 0,
  "num_train_epochs": 20,
  "save_steps": 500,
  "stateful_callbacks": {
    "TrainerControl": {
      "args": {
        "should_epoch_stop": false,
        "should_evaluate": false,
        "should_log": false,
        "should_save": true,
        "should_training_stop": true
      },
      "attributes": {}
    }
  },
  "total_flos": 1486519029399552.0,
  "train_batch_size": 8,
  "trial_name": null,
  "trial_params": null
}
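Note: the log above is internally consistent and easy to work with programmatically. One epoch corresponds to 12628 optimizer steps (the epoch-18.0 evaluation falls on step 227304 = 18 × 12628), training losses are logged every 632 steps (the `logging_steps` value), and the `learning_rate` values follow the linear scheduler decaying from the configured 1e-4 to zero over `max_steps` = 252560, consistent with zero warmup steps. A minimal sketch that checks this arithmetic and tabulates the per-epoch evaluation metrics, assuming `trainer_state.json` has been downloaded locally from this repository:

```python
import json

with open("trainer_state.json") as f:
    state = json.load(f)

lr0, max_steps = 1e-4, state["max_steps"]  # 0.0001 and 252560

def linear_lr(step: int) -> float:
    # linear decay from lr0 to zero over the full run, no warmup
    return lr0 * (max_steps - step) / max_steps

# spot-check a logged value: step 239528 -> ~5.15996e-06
assert abs(linear_lr(239528) - 5.159961989230283e-06) < 1e-12

# the end-of-epoch evaluation entries are the ones carrying eval_* keys
for e in state["log_history"]:
    if "eval_wordacc" in e:
        print(f'epoch {e["epoch"]:>4}: eval_loss={e["eval_loss"]:.4f}  '
              f'wordacc={e["eval_wordacc"]:.4f}  oov={e["eval_wordacc_oov"]:.4f}')
```

The closing summary entry checks out the same way: 252560 steps over a `train_runtime` of 19459.2196 s (about 5.4 hours) yields the reported 12.979 steps per second, and multiplying by the `train_batch_size` of 8 recovers the 103.832 samples per second.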
training_args.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:4209630a4722251c95631a073744ab4282fa18d97dd15d63b8d9b8a9c8420475
+size 5304
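Note: `training_args.bin` is stored via Git LFS, so the diff shows only the three-line pointer file (spec version, SHA-256 object id, and payload size in bytes, 5304 here), not the payload itself. The payload is the `TrainingArguments` object that `transformers.Trainer` pickles alongside checkpoints. A minimal sketch for fetching and inspecting it, assuming `huggingface_hub` and PyTorch are installed:

```python
from huggingface_hub import hf_hub_download
import torch

# downloading through the Hub resolves the LFS pointer to the real 5304-byte file
path = hf_hub_download(repo_id="aehrm/dtaec-type-normalizer", filename="training_args.bin")

# the file is pickled, so full (non-weights-only) unpickling is required;
# only do this for files from a source you trust
args = torch.load(path, weights_only=False)
print(args.learning_rate, args.num_train_epochs, args.per_device_train_batch_size)
```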