Started at: 16:58:38
norbert, 1e-06, 128
({'architectures': ['BertForMaskedLM'],
  'attention_probs_dropout_prob': 0.1,
  'hidden_act': 'gelu',
  'hidden_dropout_prob': 0.1,
  'hidden_size': 768,
  'initializer_range': 0.02,
  'intermediate_size': 3072,
  'max_position_embeddings': 512,
  'model_type': 'bert',
  'num_attention_heads': 12,
  'num_hidden_layers': 12,
  'type_vocab_size': 2,
  'vocab_size': 32922,
  '_commit_hash': '075d4e3705390691013e859faffc5696d071e33b'}, {})
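The tuple above is the model configuration as dumped by the training script. Below is a minimal sketch of how such a run header could be produced with Hugging Face transformers; the checkpoint id "ltg/norbert", the use of a single-output regression head, and the meaning of the third header value (128, taken here as max sequence length) are assumptions, not details confirmed by the log.

    # Hypothetical setup sketch; names marked "assumed" are not taken from the log.
    from transformers import AutoConfig, AutoModelForSequenceClassification, AutoTokenizer

    model_name = "ltg/norbert"   # assumed Hugging Face id for the "norbert" checkpoint
    learning_rate = 1e-06        # from the run header
    max_length = 128             # assumed meaning of the third header value

    # num_labels=1 gives a single-output (regression-style) head, consistent with MAE reporting.
    config = AutoConfig.from_pretrained(model_name, num_labels=1)
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name, config=config)
    print(config.to_dict())      # yields a dictionary like the one dumped above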
Epoch: 0
Training loss: 0.7940866667032241 - MAE: 0.7335820706168809
Validation loss: 0.2009572711061029 - MAE: 0.347151339356108
Epoch: 1
Training loss: 0.19440142974257468 - MAE: 0.33696885945893024
Validation loss: 0.19513289893374725 - MAE: 0.34191775396532464
Epoch: 2
Training loss: 0.1880853456258774 - MAE: 0.32884479290870544
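A sketch of a training/validation loop that would print the per-epoch lines above, continuing from the setup sketch (model, learning_rate). The MSE training objective, the number of epochs, and the data loaders (train_loader, val_loader, assumed to yield tokenized batches with a "labels" tensor) are assumptions, not details taken from the log.

    import torch
    from torch.nn.functional import mse_loss, l1_loss

    optimizer = torch.optim.AdamW(model.parameters(), lr=learning_rate)
    num_epochs = 3  # assumed; the log shows at least epochs 0-2

    def run_epoch(loader, train=True):
        # Average loss and MAE over all batches in one pass over the loader.
        model.train(train)
        total_loss, total_mae, n_batches = 0.0, 0.0, 0
        for batch in loader:
            with torch.set_grad_enabled(train):
                inputs = {k: v for k, v in batch.items() if k != "labels"}
                preds = model(**inputs).logits.squeeze(-1)
                loss = mse_loss(preds, batch["labels"].float())
                if train:
                    optimizer.zero_grad()
                    loss.backward()
                    optimizer.step()
            total_loss += loss.item()
            total_mae += l1_loss(preds, batch["labels"].float()).item()
            n_batches += 1
        return total_loss / n_batches, total_mae / n_batches

    for epoch in range(num_epochs):
        print(f"Epoch: {epoch}")
        train_loss, train_mae = run_epoch(train_loader, train=True)
        print(f"Training loss: {train_loss} - MAE: {train_mae}")
        val_loss, val_mae = run_epoch(val_loader, train=False)
        print(f"Validation loss: {val_loss} - MAE: {val_mae}")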