Started at: 09:50:17
nb-bert-base, 0.001, 64
({'_name_or_path': '/disk4/folder1/working/checkpoints/huggingface/native_pytorch/step4_8/', 'attention_probs_dropout_prob': 0.1, 'directionality': 'bidi', 'gradient_checkpointing': False, 'hidden_act': 'gelu', 'hidden_dropout_prob': 0.1, 'hidden_size': 768, 'initializer_range': 0.02, 'intermediate_size': 3072, 'layer_norm_eps': 1e-12, 'max_position_embeddings': 512, 'model_type': 'bert', 'num_attention_heads': 12, 'num_hidden_layers': 12, 'pad_token_id': 0, 'pooler_fc_size': 768, 'pooler_num_attention_heads': 12, 'pooler_num_fc_layers': 3, 'pooler_size_per_head': 128, 'pooler_type': 'first_token_transform', 'position_embedding_type': 'absolute', 'type_vocab_size': 2, 'vocab_size': 119547, '_commit_hash': '82b194c0b3ea1fcad65f1eceee04adb26f9f71ac'}, {})
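(For reference only: a config dict like the one logged above can be rebuilt with the Hugging Face transformers API. The snippet below is an illustrative sketch using values copied from the log; the public NbAiLab/nb-bert-base checkpoint name and the single-output regression head are assumptions, not something this log states.)

    from transformers import BertConfig, AutoModelForSequenceClassification

    # Rebuild the logged config (values copied verbatim from the dict above).
    config = BertConfig(
        hidden_size=768,
        num_hidden_layers=12,
        num_attention_heads=12,
        intermediate_size=3072,
        vocab_size=119547,
        max_position_embeddings=512,
        type_vocab_size=2,
        hidden_dropout_prob=0.1,
        attention_probs_dropout_prob=0.1,
        layer_norm_eps=1e-12,
        num_labels=1,  # assumption: one regression target, since MAE is the metric
    )

    # Assumption: the "nb-bert-base" line refers to the public NbAiLab/nb-bert-base
    # model; the log itself only shows a local checkpoint path.
    model = AutoModelForSequenceClassification.from_pretrained(
        "NbAiLab/nb-bert-base", config=config
    )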
Epoch: 0
Training loss: 0.2582416865560744 - MAE: 0.37947225695044484
Validation loss : 0.1635455173073393 - MAE: 0.3110491517398661
Epoch: 1
Training loss: 0.16106272955434492 - MAE: 0.3013588922425145
Validation loss : 0.15557559895696063 - MAE: 0.30050668306575834
Epoch: 2
Training loss: 0.15495470143628842 - MAE: 0.2955889167413265
Validation loss : 0.1551921604709192 - MAE: 0.29452050543601266
Epoch: 3
Training loss: 0.15183212577995628 - MAE: 0.2926895606129869
Validation loss : 0.15697873812733273 - MAE: 0.29536795510273695
Epoch: 4
Training loss: 0.15079305027470444 - MAE: 0.2918655737680265
Validation loss : 0.16038688498012948 - MAE: 0.29748381775815635
Epoch: 5
Training loss: 0.1489795818924904 - MAE: 0.2907857149496023
Validation loss : 0.15906145965511148 - MAE: 0.29660232101442935
Stopped after 5 epochs.
Prediction MAE: 0.2767
Finished at: 09:50:17
Time taken: 688 s.