Started at: 04:40:02
norbert, 0.001, 256
({'architectures': ['BertForMaskedLM'], 'attention_probs_dropout_prob': 0.1, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'type_vocab_size': 2, 'vocab_size': 32922, '_commit_hash': '075d4e3705390691013e859faffc5696d071e33b'}, {})
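The tuple above is the pretrained configuration printed at startup. A minimal sketch of reloading that configuration with the Hugging Face transformers library is shown below; the model id "ltg/norbert" is an assumption chosen to match the printed vocab_size (32922) and BertForMaskedLM architecture, and only the commit hash is taken directly from the log.

# Minimal sketch, assuming the checkpoint lives on the Hugging Face Hub as "ltg/norbert"
# (assumed model id; the log only shows the config fields and a commit hash).
from transformers import AutoConfig

config = AutoConfig.from_pretrained(
    "ltg/norbert",                                         # assumed model id
    revision="075d4e3705390691013e859faffc5696d071e33b",   # _commit_hash from the config printout above
)
print(config.to_dict())  # should list hidden_size=768, num_hidden_layers=12, vocab_size=32922, ...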
Epoch: 0
Training loss: 0.490340878367424 - MAE: 0.5502402056555946
Validation loss : 0.2153187460369534 - MAE: 0.34847071386417267
Epoch: 1
Training loss: 0.20058341324329376 - MAE: 0.3423091295468801
Validation loss : 0.17609875235292646 - MAE: 0.3207490493525149
Epoch: 2
Training loss: 0.17480310261249543 - MAE: 0.31785240619760485
Validation loss : 0.17104972898960114 - MAE: 0.31704982503328616
Epoch: 3