Started at: 15:20:57
({'architectures': ['BertForMaskedLM'],
  'attention_probs_dropout_prob': 0.1,
  'hidden_act': 'gelu',
  'hidden_dropout_prob': 0.1,
  'hidden_size': 768,
  'initializer_range': 0.02,
  'intermediate_size': 3072,
  'max_position_embeddings': 512,
  'model_type': 'bert',
  'num_attention_heads': 12,
  'num_hidden_layers': 12,
  'type_vocab_size': 2,
  'vocab_size': 50104,
  '_commit_hash': 'f22bb47f536f62edfcd86ca9320ade990eafbe22'},
 {})
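The printed configuration matches a standard BERT-base layout (12 layers, 12 heads, hidden size 768) with an enlarged vocabulary of 50,104 tokens. A minimal sketch, using plain Python only (no `transformers` dependency assumed), that reconstructs the dictionary and checks the usual BERT-base invariants:

```python
# Values copied from the logged config above.
bert_config = {
    "architectures": ["BertForMaskedLM"],
    "attention_probs_dropout_prob": 0.1,
    "hidden_act": "gelu",
    "hidden_dropout_prob": 0.1,
    "hidden_size": 768,
    "initializer_range": 0.02,
    "intermediate_size": 3072,
    "max_position_embeddings": 512,
    "model_type": "bert",
    "num_attention_heads": 12,
    "num_hidden_layers": 12,
    "type_vocab_size": 2,
    "vocab_size": 50104,
}

# BERT-base invariants: 64-dim attention heads and a 4x FFN expansion.
head_dim = bert_config["hidden_size"] // bert_config["num_attention_heads"]
ffn_ratio = bert_config["intermediate_size"] // bert_config["hidden_size"]
print(head_dim, ffn_ratio)  # 64 4
```

The `_commit_hash` field pins the exact Hugging Face Hub revision the config was loaded from, so the run is reproducible against that snapshot.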
Epoch: 0
Training loss: 0.3436830922961235 - MAE: 0.4572523557050669
Validation loss: 0.26256299018859863 - MAE: 0.40728324728553433
Epoch: 1
Training loss: 0.21011242344975473 - MAE: 0.35190288965025124
Validation loss: 0.18477652966976166 - MAE: 0.3296733541388515
Epoch: 2
Training loss: 0.19629175141453742 - MAE: 0.33688825701191527
Validation loss: 0.18152975397450583 - MAE: 0.32990742521562766
Epoch: 3
Training loss: 0.1831872671842575 - MAE: 0.3247871467745211
Validation loss: 0.17427935344832285 - MAE: 0.3213832702684575
Epoch: 4
Training loss: 0.1773839347064495 - MAE: 0.31992620730258825
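Each epoch reports the training objective alongside MAE (mean absolute error), which is the metric actually trending down above. A minimal sketch of how such an MAE figure can be computed, assuming predictions and targets are flat lists of floats (the example values here are hypothetical, not taken from this run):

```python
def mean_absolute_error(preds, targets):
    """Mean of |prediction - target| over paired values."""
    if len(preds) != len(targets):
        raise ValueError("preds and targets must have the same length")
    return sum(abs(p - t) for p, t in zip(preds, targets)) / len(preds)

# Hypothetical values for illustration only.
preds = [0.2, 0.8, 0.5]
targets = [0.0, 1.0, 0.5]
print(mean_absolute_error(preds, targets))  # ~0.1333
```

In a typical regression setup this would be computed per batch on detached model outputs and averaged over the epoch, which is what the per-epoch MAE lines above report.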