Started at: 11:58:16
nb-bert-base, 0.001, 64
({'_name_or_path': '/disk4/folder1/working/checkpoints/huggingface/native_pytorch/step4_8/', 'attention_probs_dropout_prob': 0.1, 'directionality': 'bidi', 'gradient_checkpointing': False, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'layer_norm_eps': 1e-12, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'pad_token_id': 0, 'pooler_fc_size': 768, 'pooler_num_attention_heads': 12, 'pooler_num_fc_layers': 3, 'pooler_size_per_head': 128, 'pooler_type': 'first_token_transform', 'position_embedding_type': 'absolute', 'type_vocab_size': 2, 'vocab_size': 119547, '_commit_hash': '82b194c0b3ea1fcad65f1eceee04adb26f9f71ac'}, {})
Epoch: 0
Training loss: 0.2541032898606676 - MAE: 0.37744714470934815
Validation loss : 0.16125401561007355 - MAE: 0.31002220634618377
Epoch: 1
Training loss: 0.1594780596217724 - MAE: 0.3023192776542885
Validation loss : 0.15070902104630615 - MAE: 0.29424405533888026
Epoch: 2
Training loss: 0.15478725806631224 - MAE: 0.2983057011321247
Validation loss : 0.14809446172280746 - MAE: 0.2900488703402422
Epoch: 3
Training loss: 0.14879934384365273 - MAE: 0.2912055242431901
Validation loss : 0.15311477581659952 - MAE: 0.2928259934734315
Epoch: 4
Training loss: 0.14859358951298876 - MAE: 0.2918463923578645
Validation loss : 0.15150787410410968 - MAE: 0.29099765286956236
Epoch: 5
Training loss: 0.14651678171422747 - MAE: 0.290210786231306
Validation loss : 0.16598732430826535 - MAE: 0.3048088374997105
Stopped after 5 epochs.
Prediction MAE: 0.2867
Finished at: 11:58:16
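The log above records a fine-tuning run of nb-bert-base as a single-output regressor with learning rate 0.001 and batch size 64, reporting MSE-style loss and MAE per epoch and stopping once validation loss stops improving. Below is a minimal sketch of such a loop, reconstructed only from the logged hyperparameters; the Hub model id "NbAiLab/nb-bert-base" (the log loaded a local checkpoint), the MSE loss, the AdamW optimizer, the placeholder dataset, and the early-stopping patience are all assumptions, not taken from the log.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Public checkpoint assumed here; the log actually loaded a local path.
tokenizer = AutoTokenizer.from_pretrained("NbAiLab/nb-bert-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "NbAiLab/nb-bert-base", num_labels=1  # single-output regression head (assumed)
).to(device)

optimizer = torch.optim.AdamW(model.parameters(), lr=0.001)  # lr from the log
loss_fn = torch.nn.MSELoss()  # assumed; the log only shows loss and MAE values

# Hypothetical placeholder data so the sketch runs end to end; the real run
# used a labelled dataset not shown in the log.
texts = ["eksempeltekst"] * 256
targets = torch.rand(256)
enc = tokenizer(texts, padding=True, truncation=True, max_length=64, return_tensors="pt")
dataset = TensorDataset(enc["input_ids"], enc["attention_mask"], targets)
train_loader = DataLoader(dataset, batch_size=64, shuffle=True)  # batch size from the log
val_loader = DataLoader(dataset, batch_size=64)

def run_epoch(loader, train=True):
    """One pass over the data, returning mean loss and mean absolute error."""
    model.train(train)
    total_loss, total_mae, n = 0.0, 0.0, 0
    for input_ids, attention_mask, y in loader:
        input_ids, attention_mask, y = (
            input_ids.to(device), attention_mask.to(device), y.to(device))
        with torch.set_grad_enabled(train):
            preds = model(input_ids, attention_mask=attention_mask).logits.squeeze(-1)
            loss = loss_fn(preds, y)
        if train:
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        total_loss += loss.item() * len(y)
        total_mae += (preds - y).abs().sum().item()
        n += len(y)
    return total_loss / n, total_mae / n

# Early stopping on validation loss, as suggested by "Stopped after 5 epochs";
# the patience value is an assumption.
best_val, patience, bad_epochs = float("inf"), 1, 0
for epoch in range(20):
    train_loss, train_mae = run_epoch(train_loader, train=True)
    val_loss, val_mae = run_epoch(val_loader, train=False)
    print(f"Epoch: {epoch}")
    print(f"Training loss: {train_loss} - MAE: {train_mae}")
    print(f"Validation loss : {val_loss} - MAE: {val_mae}")
    if val_loss < best_val:
        best_val, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs > patience:
            print(f"Stopped after {epoch} epochs.")
            break
```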