Started at: 09:42:17
norbert2, 0.005, 256
({'architectures': ['BertForMaskedLM'], 'attention_probs_dropout_prob': 0.1, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'type_vocab_size': 2, 'vocab_size': 50104, '_commit_hash': 'f22bb47f536f62edfcd86ca9320ade990eafbe22'}, {})
Epoch: 0
Training loss: 0.6569913045926528 - MAE: 0.6331589070184822
Validation loss : 0.3369679701955695 - MAE: 0.46112028136408306
Epoch: 1
Training loss: 0.348409844528545 - MAE: 0.4680741973736075
Validation loss : 0.34815617768388046 - MAE: 0.4621900997363863
Epoch: 2
Training loss: 0.3364206330342726 - MAE: 0.4603252193854543
Validation loss : 0.3215927089515485 - MAE: 0.44344506688545865
Epoch: 3
Training loss: 0.324529029564424 - MAE: 0.4517557093205867
Validation loss : 0.304106238641237 - MAE: 0.4338629278168784
Epoch: 4
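
For context, the log records a run with model norbert2 and the values 0.005 and 256 (presumably learning rate and batch size), reporting mean training and validation loss together with MAE per epoch. Below is a minimal sketch, assuming a PyTorch regression setup, of a loop that would emit lines in exactly this format; it is not the original training script, and the function name, optimizer, and MSE loss are assumptions.

import torch
from torch import nn

def run_training(model, train_loader, val_loader, epochs, lr):
    """Assumed setup: `model` maps a batch of inputs to one score per example
    (e.g. a regression head on top of the norbert2 encoder) and the loaders
    yield (inputs, targets) pairs. Optimizer and loss choices are guesses."""
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    mse, mae = nn.MSELoss(), nn.L1Loss()
    for epoch in range(epochs):
        print(f"Epoch: {epoch}")
        # Training pass: accumulate per-batch loss and MAE, then report the mean.
        model.train()
        tr_loss = tr_mae = n = 0
        for inputs, targets in train_loader:
            optimizer.zero_grad()
            preds = model(inputs).squeeze(-1)
            loss = mse(preds, targets)
            loss.backward()
            optimizer.step()
            tr_loss += loss.item()
            tr_mae += mae(preds, targets).item()
            n += 1
        print(f"Training loss: {tr_loss / n} - MAE: {tr_mae / n}")
        # Validation pass: same metrics, no gradient updates.
        model.eval()
        va_loss = va_mae = m = 0
        with torch.no_grad():
            for inputs, targets in val_loader:
                preds = model(inputs).squeeze(-1)
                va_loss += mse(preds, targets).item()
                va_mae += mae(preds, targets).item()
                m += 1
        print(f"Validation loss : {va_loss / m} - MAE: {va_mae / m}")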