home-standard-results / 06-05-2023_norbert_batchsize_256_all_units_0.0001_1_200_freeze_True_earlystop_3.txt
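A note on the filename above: it encodes the run configuration, i.e. the norbert model, batch size 256, learning rate 0.0001, what appears to be a 200-epoch cap, a frozen encoder (freeze_True), and early stopping with patience 3 (earlystop_3). As a minimal sketch only (not the original script), the encoder described by the config dumps below could be loaded and frozen roughly as follows, assuming the Hugging Face transformers API; the checkpoint path is a placeholder, since the log does not name one.

from transformers import AutoModel, AutoTokenizer, BertConfig

checkpoint = "path/to/norbert-checkpoint"  # placeholder; the actual checkpoint is not named in this log

# The "({'architectures': ...}, {})" lines below have the shape of the
# (config_dict, unused_kwargs) pair returned by this call.
print(BertConfig.get_config_dict(checkpoint))

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
encoder = AutoModel.from_pretrained(checkpoint)

# "freeze_True" in the filename: keep the pretrained encoder fixed, so only a task head is trained.
for param in encoder.parameters():
    param.requires_grad = False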
Started at: 17:01:49
({'architectures': ['BertForMaskedLM'], 'attention_probs_dropout_prob': 0.1, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'type_vocab_size': 2, 'vocab_size': 32922, '_commit_hash': '075d4e3705390691013e859faffc5696d071e33b'}, {})
Epoch: 0
Started at: 17:03:05
({'architectures': ['BertForMaskedLM'], 'attention_probs_dropout_prob': 0.1, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'type_vocab_size': 2, 'vocab_size': 32922, '_commit_hash': '075d4e3705390691013e859faffc5696d071e33b'}, {})
Started at: 17:03:18
({'architectures': ['BertForMaskedLM'], 'attention_probs_dropout_prob': 0.1, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'type_vocab_size': 2, 'vocab_size': 32922, '_commit_hash': '075d4e3705390691013e859faffc5696d071e33b'}, {})
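The per-epoch blocks that follow report training and validation loss together with MAE. A minimal sketch of a loop that would produce lines in this format, with early stopping on validation loss at patience 3 per "earlystop_3" in the filename; run_epoch, train_loader and val_loader are hypothetical placeholders for the model, data and optimization code, which this log does not show.

best_val_loss = float("inf")
patience, bad_epochs = 3, 0          # "earlystop_3": stop after 3 epochs without improvement

for epoch in range(200):             # "200" in the filename, read here as the epoch cap
    print(f"Epoch: {epoch}")

    train_loss, train_mae = run_epoch(train_loader, train=True)    # hypothetical helper
    print(f"Training loss: {train_loss} - MAE: {train_mae}")

    val_loss, val_mae = run_epoch(val_loader, train=False)
    print(f"Validation loss : {val_loss} - MAE: {val_mae}")

    if val_loss < best_val_loss:     # improvement: remember it and reset the counter
        best_val_loss, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            break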
Epoch: 0
Training loss: 1.5561567282676696 - MAE: 1.1770511100698875
Validation loss : 0.939958651860555 - MAE: 0.8876047142856601
Epoch: 1
Training loss: 0.6363586246967315 - MAE: 0.6928943697649724
Validation loss : 0.33423560195498997 - MAE: 0.45375851608477785
Epoch: 2
Training loss: 0.2759279823303223 - MAE: 0.4052092643476477
Validation loss : 0.20369169281588662 - MAE: 0.3433357920074413
Epoch: 3
Training loss: 0.20272068321704864 - MAE: 0.3399617345503301
Validation loss : 0.19545602964030373 - MAE: 0.3416856898288076
Epoch: 4
Training loss: 0.1912100625038147 - MAE: 0.3326939730215834
Validation loss : 0.19473741783036125 - MAE: 0.3417824430892937
Epoch: 5
Training loss: 0.19083771824836732 - MAE: 0.33137282498844967
Validation loss : 0.19215407967567444 - MAE: 0.33912431491105655
Epoch: 6
Training loss: 0.18923945665359498 - MAE: 0.33146465315973606
Validation loss : 0.18989400565624237 - MAE: 0.33674625605103725
Epoch: 7
Training loss: 0.18950583219528197 - MAE: 0.33087934479225867
Validation loss : 0.18789740900198618 - MAE: 0.33464578460452915
Epoch: 8
Training loss: 0.1856936401128769 - MAE: 0.3275132613322806
Validation loss : 0.1864797506067488 - MAE: 0.33324104491047596
Epoch: 9
Training loss: 0.1836453753709793 - MAE: 0.3255223596708951
Validation loss : 0.18501530918810102 - MAE: 0.3316937400385192
Epoch: 10
Training loss: 0.18246697008609772 - MAE: 0.32322550938186817
Validation loss : 0.18376492957274118 - MAE: 0.33042730021623856
Epoch: 11
Training loss: 0.1811656492948532 - MAE: 0.32419844121595726
Validation loss : 0.18262643449836308 - MAE: 0.32929917485750665
Epoch: 12
Training loss: 0.18094802320003509 - MAE: 0.32277421430430436
Validation loss : 0.18164582550525665 - MAE: 0.3283432280968596
Epoch: 13
Training loss: 0.1792392772436142 - MAE: 0.3209453182214294
Validation loss : 0.18065459695127276 - MAE: 0.3273255372779676
Epoch: 14
Training loss: 0.17940414905548097 - MAE: 0.3216579290630419
Validation loss : 0.17975443104902902 - MAE: 0.3264252069830345
Epoch: 15
Training loss: 0.17663683056831359 - MAE: 0.31768596289501483
Validation loss : 0.17874346673488617 - MAE: 0.3253559679886541
Epoch: 16
Training loss: 0.17649461567401886 - MAE: 0.3184662269845276
Validation loss : 0.17812747756640115 - MAE: 0.32483073473627894
Epoch: 17
Training loss: 0.17691377580165862 - MAE: 0.31822431281431196
Validation loss : 0.17722326185968187 - MAE: 0.3238687376329823
Epoch: 18
Training loss: 0.1768798279762268 - MAE: 0.3188422584293619
Validation loss : 0.17655367155869803 - MAE: 0.32321426470152304
Epoch: 19
Training loss: 0.17445733308792113 - MAE: 0.31640191359506836
Validation loss : 0.17601815693908268 - MAE: 0.3227230412018289
Epoch: 20
Training loss: 0.17480538487434388 - MAE: 0.3161891511192775
Validation loss : 0.1753774748908149 - MAE: 0.32208324574836594
Epoch: 21
Training loss: 0.1732891607284546 - MAE: 0.31558897873451675
Validation loss : 0.1746254563331604 - MAE: 0.3213051447966539
Epoch: 22
Training loss: 0.17241579234600068 - MAE: 0.3140265985892311