frozen_nb-bert-base_256_0.001_512.txt
Started at: 18:07:27
nb-bert-base, 0.001, 256
({'_name_or_path': '/disk4/folder1/working/checkpoints/huggingface/native_pytorch/step4_8/', 'attention_probs_dropout_prob': 0.1, 'directionality': 'bidi', 'gradient_checkpointing': False, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'layer_norm_eps': 1e-12, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'pad_token_id': 0, 'pooler_fc_size': 768, 'pooler_num_attention_heads': 12, 'pooler_num_fc_layers': 3, 'pooler_size_per_head': 128, 'pooler_type': 'first_token_transform', 'position_embedding_type': 'absolute', 'type_vocab_size': 2, 'vocab_size': 119547, '_commit_hash': '82b194c0b3ea1fcad65f1eceee04adb26f9f71ac'}, {})
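For context, the run below can be reproduced with a setup along these lines: a frozen nb-bert-base encoder with a small trainable regression head, trained with MSE loss and MAE reported per epoch. This is a minimal sketch only; the hub id "NbAiLab/nb-bert-base", the meaning of the "256" (taken here as max sequence length), the plain Adam optimizer, and the single-output regression head are assumptions, not details confirmed by this log.

# Hedged sketch of a frozen-encoder regression run matching the hyperparameters above.
# Assumptions: hub id, max length, optimizer choice, and the MSE/MAE pairing.
import torch
from torch import nn
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "NbAiLab/nb-bert-base"   # assumed hub id for nb-bert-base
MAX_LENGTH = 256                      # assumed meaning of the "256" above
LEARNING_RATE = 0.001                 # as listed in the header line

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)

# Freeze the encoder; only the regression head receives gradients.
for param in encoder.parameters():
    param.requires_grad = False

# Single-output regression head on top of the [CLS] representation (assumed task shape).
head = nn.Linear(encoder.config.hidden_size, 1)

optimizer = torch.optim.Adam(head.parameters(), lr=LEARNING_RATE)
mse = nn.MSELoss()
mae = nn.L1Loss()   # MAE, the metric reported per epoch in the log

def training_step(texts, targets):
    """Run one gradient step and return (loss, mae) as Python floats."""
    batch = tokenizer(texts, padding=True, truncation=True,
                      max_length=MAX_LENGTH, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state[:, 0]  # [CLS] token
    preds = head(hidden).squeeze(-1)
    targets = torch.as_tensor(targets, dtype=torch.float)
    loss = mse(preds, targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item(), mae(preds.detach(), targets).item()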
Epoch: 0
Training loss: 0.46166665077209473 - MAE: 0.532284292943485
Validation loss : 0.21881521575980717 - MAE: 0.35093047126771504
Epoch: 1
Training loss: 0.19428934633731842 - MAE: 0.3369849436358454
Validation loss : 0.17015555004278818 - MAE: 0.3188480005836794
Epoch: 2
Training loss: 0.17059184610843658 - MAE: 0.3135526208759563
Validation loss : 0.16331283085876042 - MAE: 0.3111560966605882
Epoch: 3
Training loss: 0.1648101744055748 - MAE: 0.30816099238754846
Validation loss : 0.15740243097146353 - MAE: 0.3036171154612899
Epoch: 4
Training loss: 0.16064140915870667 - MAE: 0.3034752969970701
Validation loss : 0.1544146024518543 - MAE: 0.3000950328112062
Epoch: 5
Training loss: 0.15893334209918974 - MAE: 0.30019076958852603
Validation loss : 0.15207635197374555 - MAE: 0.2973798486881051
Epoch: 6
Training loss: 0.15733025580644608 - MAE: 0.298333740307608
Validation loss : 0.15064537939098147 - MAE: 0.2956621068139803
Epoch: 7
Training loss: 0.1533995896577835 - MAE: 0.2953341499795722
Validation loss : 0.14883457786507076 - MAE: 0.2933132114117648
Epoch: 8
Training loss: 0.15376180291175842 - MAE: 0.2940413396593056
Validation loss : 0.1487489938735962 - MAE: 0.29380870235239887
Epoch: 9
Training loss: 0.1510305792093277 - MAE: 0.2921084560611642
Validation loss : 0.14708556234836578 - MAE: 0.2915289821381632
Epoch: 10
Training loss: 0.15110637426376342 - MAE: 0.2922121841602223
Validation loss : 0.14670881297853258 - MAE: 0.2911653635264904
Epoch: 11
Training loss: 0.1488349175453186 - MAE: 0.2889507358141949
Validation loss : 0.14464906023608315 - MAE: 0.28820148464954054
Epoch: 12
Training loss: 0.14891418755054475 - MAE: 0.29070937547388537
Validation loss : 0.14401954164107642 - MAE: 0.2873828016254966
Epoch: 13
Training loss: 0.1495919942855835 - MAE: 0.2898801773120928
Validation loss : 0.14271563208765453 - MAE: 0.2851468490550366
Epoch: 14
Training loss: 0.14855773389339447 - MAE: 0.28958241262300716
Validation loss : 0.14220061318741906 - MAE: 0.2845612303188776
Epoch: 15
Training loss: 0.14888249456882477 - MAE: 0.2899555840495182
Validation loss : 0.1420163835088412 - MAE: 0.2840199988754225
Epoch: 16
Training loss: 0.14737528800964356 - MAE: 0.2884658645299053
Validation loss : 0.14170359654559028 - MAE: 0.28285520851160917
Epoch: 17
Training loss: 0.14756221681833268 - MAE: 0.2894349343213634
Validation loss : 0.14145300537347794 - MAE: 0.28289574169003384
Epoch: 18
Training loss: 0.14682800233364104 - MAE: 0.288245846442026