Started at: 14:58:07
({'architectures': ['BertForMaskedLM'], 'attention_probs_dropout_prob': 0.1, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'type_vocab_size': 2, 'vocab_size': 50104, '_commit_hash': 'f22bb47f536f62edfcd86ca9320ade990eafbe22'}, {})
Epoch: 0
Training loss: 0.7675554938614368 - MAE: 0.7218268842869793
Validation loss : 0.5840958555539449 - MAE: 0.5827894046582064
Epoch: 1
Training loss: 0.49415092915296555 - MAE: 0.5986839248419658
Validation loss : 0.4720619022846222 - MAE: 0.585200722348146
Epoch: 2
Training loss: 0.4484046511352062 - MAE: 0.554433520788571
Validation loss : 0.4777786135673523 - MAE: 0.5403046027523047
Epoch: 3
Training loss: 0.415366031229496 - MAE: 0.5435317536295668
Validation loss : 0.42990968624750775 - MAE: 0.5419536063906305
Epoch: 4
Training loss: 0.39412142150104046 - MAE: 0.5245434911285064
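The per-epoch entries above follow a fixed three-line pattern: an epoch header, then training loss with MAE, then validation loss with MAE. A minimal sketch of how such lines could be produced (the helper names `mae` and `log_epoch` are hypothetical, not taken from the original training script):

```python
from statistics import mean

def mae(preds, targets):
    # Mean absolute error: the metric reported next to each loss above.
    return mean(abs(p - t) for p, t in zip(preds, targets))

def log_epoch(epoch, train_loss, train_mae, val_loss, val_mae):
    # Format one epoch in the same three-line style as the log
    # (including the space before the colon in "Validation loss :").
    return (
        f"Epoch: {epoch}\n"
        f"Training loss: {train_loss} - MAE: {train_mae}\n"
        f"Validation loss : {val_loss} - MAE: {val_mae}"
    )

print(mae([0.5, 1.0], [0.0, 0.0]))  # 0.75
print(log_epoch(0, 0.7676, 0.7218, 0.5841, 0.5828))
```

Both training and validation MAE fall monotonically over the first epochs, consistent with the loss values shown, so the run appears to be converging rather than overfitting at this point.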