Started at: 15:33:47
norbert, 5e-06, 128
({'architectures': ['BertForMaskedLM'], 'attention_probs_dropout_prob': 0.1, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'type_vocab_size': 2, 'vocab_size': 32922, '_commit_hash': '075d4e3705390691013e859faffc5696d071e33b'}, {})
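For reference, a minimal sketch (an assumption, not part of this run's code) of rebuilding the configuration dumped above with Hugging Face transformers: it attaches a single-output regression head since the run tracks MAE, and takes 5e-06 from the hyperparameter line as the learning rate; the 128 is assumed to be the batch size, and the weights here are randomly initialised rather than the actual NorBERT checkpoint.

import torch
from transformers import BertConfig, BertForSequenceClassification

# Values copied from the config dump above; num_labels=1 turns the head into a regressor.
config = BertConfig(
    attention_probs_dropout_prob=0.1,
    hidden_act="gelu",
    hidden_dropout_prob=0.1,
    hidden_size=768,
    initializer_range=0.02,
    intermediate_size=3072,
    max_position_embeddings=512,
    num_attention_heads=12,
    num_hidden_layers=12,
    type_vocab_size=2,
    vocab_size=32922,
    num_labels=1,
)
model = BertForSequenceClassification(config)   # randomly initialised stand-in for the NorBERT weights
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-06)

With num_labels=1, transformers trains this head with an MSE loss by default, which would explain why the logged loss and MAE values differ: the MAE would then be computed separately as an evaluation metric.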
Epoch: 0
Training loss: 0.3591454255580902 - MAE: 0.45179795905328163
Validation loss : 0.1890380075749229 - MAE: 0.3379535505810491
Epoch: 1
Training loss: 0.17689531534910202 - MAE: 0.3208967210731947