Started at: 10:42:59
({'architectures': ['BertForMaskedLM'], 'attention_probs_dropout_prob': 0.1, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'type_vocab_size': 2, 'vocab_size': 50104, '_commit_hash': 'f22bb47f536f62edfcd86ca9320ade990eafbe22'}, {})
Epoch: 0
Training loss: 0.2707270306348801 - MAE: 0.4014548746281838
Validation loss : 0.18827014460283167 - MAE: 0.3350395914193632
Epoch: 1
Training loss: 0.18397963613271714 - MAE: 0.3274942520732981
Validation loss : 0.17896055035731373 - MAE: 0.32625804300542655
Epoch: 2