Started at: 11:24:54
({'architectures': ['BertForMaskedLM'], 'attention_probs_dropout_prob': 0.1, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'type_vocab_size': 2, 'vocab_size': 50104, '_commit_hash': 'f22bb47f536f62edfcd86ca9320ade990eafbe22'}, {})
Epoch: 0
Training loss: 0.6718602430820465 - MAE: 0.6594824697225725
Validation loss: 0.20548770328362784 - MAE: 0.35383498846315387
Epoch: 1
Training loss: 0.2140548872947693 - MAE: 0.35666712704976117
Validation loss: 0.17712030311425528 - MAE: 0.32385391673658914
Epoch: 2
Training loss: 0.1737816858291626 - MAE: 0.31725219154273515
Validation loss: 0.16664134628242916 - MAE: 0.31069026065156913
Epoch: 3
Training loss: 0.16521128743886948 - MAE: 0.3069983786566015
Validation loss: 0.1642034633292092 - MAE: 0.3103794749395031
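The MAE figures above are mean absolute error between model predictions and targets. A minimal sketch of that metric (the `preds`/`targets` values here are hypothetical examples, not taken from this run):

```python
# Mean absolute error: average of |prediction - target| over all examples.
def mean_absolute_error(preds, targets):
    assert len(preds) == len(targets) and preds, "need equal, non-empty inputs"
    return sum(abs(p - t) for p, t in zip(preds, targets)) / len(preds)

# Hypothetical example values for illustration only.
preds = [0.2, 0.8, 0.5]
targets = [0.0, 1.0, 0.5]
print(mean_absolute_error(preds, targets))
```

A lower MAE means predictions are closer to the targets on average, which is why both the training and validation MAE shrinking across epochs indicates the regression head is learning.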