home-standard-results / 03-05-2023_norbert2_batchsize_512_all_units_0.001_1_150_freeze_True.txt
Started at: 13:38:19
({'architectures': ['BertForMaskedLM'], 'attention_probs_dropout_prob': 0.1, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'type_vocab_size': 2, 'vocab_size': 50104, '_commit_hash': 'f22bb47f536f62edfcd86ca9320ade990eafbe22'}, {})
Epoch: 0
Training loss: 0.4006912616583017 - MAE: 0.505064727315284
Validation loss : 0.309359085559845 - MAE: 0.44078509929430004
Epoch: 1
Training loss: 0.250718348301374 - MAE: 0.3922167896234202
Validation loss : 0.1936800718307495 - MAE: 0.33703642252306854
Epoch: 2
Training loss: 0.21618989912363198 - MAE: 0.3581347894767805
Validation loss : 0.18340231478214264 - MAE: 0.33186761275995064
Epoch: 3
Training loss: 0.19920441508293152 - MAE: 0.3401495898235937
Validation loss : 0.1788225769996643 - MAE: 0.32578763746292233
Epoch: 4
Training loss: 0.1918170922077619 - MAE: 0.33396066058861956
Validation loss : 0.17603523135185242 - MAE: 0.32383651430230814
Epoch: 5
Training loss: 0.18379446978752428 - MAE: 0.3262024097906551
Validation loss : 0.17055903375148773 - MAE: 0.31736346780813546
Epoch: 6
Training loss: 0.18060297461656424 - MAE: 0.3232976047514557
Validation loss : 0.16787597239017488 - MAE: 0.3139610728375986
Epoch: 7
Training loss: 0.17637471740062421 - MAE: 0.3190542525274511
Validation loss : 0.1664418339729309 - MAE: 0.3116923743658102
Epoch: 8
Training loss: 0.17314187494608071 - MAE: 0.3164703095268888
Validation loss : 0.16483855247497559 - MAE: 0.31011736716454275
Epoch: 9
Training loss: 0.17189548795039838 - MAE: 0.31287147634421525
Validation loss : 0.16325874030590057 - MAE: 0.3085789200886582
Epoch: 10
Training loss: 0.1694010473214663 - MAE: 0.31118425956842755
Validation loss : 0.16111792027950286 - MAE: 0.3070655232845323
Epoch: 11
Training loss: 0.16910729843836564 - MAE: 0.30878990252768745
Validation loss : 0.16129685938358307 - MAE: 0.3079809592227729
Epoch: 12
Training loss: 0.16443614203196305 - MAE: 0.3071422337796407
Validation loss : 0.15956343114376068 - MAE: 0.3059515625935648
Epoch: 13
Training loss: 0.16316631436347961 - MAE: 0.30409203692834197
Validation loss : 0.15849626958370208 - MAE: 0.3045730253138591
Epoch: 14
Training loss: 0.1635306592171009 - MAE: 0.3037401073073271
Validation loss : 0.158303701877594 - MAE: 0.30481573363411335
Epoch: 15
Training loss: 0.15958423912525177 - MAE: 0.30241642588486894
Validation loss : 0.15796978175640106 - MAE: 0.30471954056125206
Epoch: 16
Training loss: 0.15946145011828497 - MAE: 0.30110866489241084
Validation loss : 0.157157564163208 - MAE: 0.3034879952867451
Epoch: 17
Training loss: 0.15820241662172171 - MAE: 0.2994550238328829
Validation loss : 0.15616505742073059 - MAE: 0.3022687875902045
Epoch: 18
Training loss: 0.1558972322023832 - MAE: 0.29737974784702687
Validation loss : 0.1558208167552948 - MAE: 0.30243176755634804
Epoch: 19
Training loss: 0.15588092116209176 - MAE: 0.297545939168588
Validation loss : 0.15458311438560485 - MAE: 0.2999680718614845
Epoch: 20
Training loss: 0.1549827937896435 - MAE: 0.2975774431377147
Validation loss : 0.15556134283542633 - MAE: 0.301918822118164
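The per-epoch lines above follow a regular format, so they can be collected into a table for plotting or comparison. The sketch below is a minimal, hypothetical helper (`parse_log` is not part of the original run) that assumes the exact `Training loss: … - MAE: …` / `Validation loss : … - MAE: …` layout shown in this log; restart blocks that contain an `Epoch: 0` line with no loss lines simply yield rows with only the epoch number.

```python
import re

# Matches the per-epoch metric lines of the log, tolerating the stray
# space before the colon in "Validation loss : ...".
LINE_RE = re.compile(
    r"(Training|Validation) loss\s*:\s*([0-9.]+)\s*-\s*MAE:\s*([0-9.]+)"
)

def parse_log(text):
    """Return one dict per 'Epoch:' block with train/val loss and MAE."""
    epochs = []
    current = None
    for line in text.splitlines():
        if line.startswith("Epoch:"):
            current = {"epoch": int(line.split(":")[1])}
            epochs.append(current)
            continue
        m = LINE_RE.search(line)
        if m and current is not None:
            kind = "train" if m.group(1) == "Training" else "val"
            current[f"{kind}_loss"] = float(m.group(2))
            current[f"{kind}_mae"] = float(m.group(3))
    return epochs
```

Applied to this file, the result is a list of 21 rows (epochs 0-20) from which, for example, the best validation MAE (epoch 19, ≈0.300) can be read off directly.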