# EElayoutlmv3_jordyvl_rvl_cdip_100_examples_per_class_2023-12-05_txt_vis_concat_enc_10_gate
This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on an unspecified dataset; the model name suggests a 100-examples-per-class subset of RVL-CDIP. It achieves the following results on the evaluation set (the exit metrics are explained below the list):
- Loss: 1.0000
- Accuracy: 0.75
- Exit 0 Accuracy: 0.055
- Exit 1 Accuracy: 0.22
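The "Exit n" accuracies refer to intermediate classifier heads: an early-exit model can return a prediction from a shallow layer (exit 0, exit 1) instead of running the full encoder. Below is a minimal sketch of how per-exit accuracy is typically computed; the tiny encoder stands in for the actual LayoutLMv3 backbone, and every name in it is illustrative rather than this model's API.

```python
# Toy multi-exit classifier: a stand-in encoder with two early-exit heads,
# used only to illustrate how per-exit accuracy is measured. All names are
# hypothetical; the real model builds gated exits on microsoft/layoutlmv3-base.
import torch
import torch.nn as nn

class TinyMultiExitEncoder(nn.Module):
    def __init__(self, hidden=32, num_layers=4, num_classes=16, exit_layers=(1, 2)):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(hidden, hidden) for _ in range(num_layers))
        self.exit_layers = exit_layers
        # One classifier head per early exit, plus the final head.
        self.exit_heads = nn.ModuleList(nn.Linear(hidden, num_classes) for _ in exit_layers)
        self.final_head = nn.Linear(hidden, num_classes)

    def forward(self, x):
        exit_logits, heads = [], iter(self.exit_heads)
        for i, layer in enumerate(self.layers):
            x = torch.relu(layer(x))
            if i in self.exit_layers:  # tap the hidden state early
                exit_logits.append(next(heads)(x))
        return exit_logits, self.final_head(x)

@torch.no_grad()
def per_exit_accuracy(model, x, labels):
    exit_logits, final_logits = model(x)
    exit_accs = [(l.argmax(-1) == labels).float().mean().item() for l in exit_logits]
    final_acc = (final_logits.argmax(-1) == labels).float().mean().item()
    return exit_accs, final_acc

model = TinyMultiExitEncoder()
x, labels = torch.randn(8, 32), torch.randint(0, 16, (8,))
(exit0, exit1), final = per_exit_accuracy(model, x, labels)
print(f"exit 0: {exit0:.3f}, exit 1: {exit1:.3f}, final: {final:.3f}")
```

For scale, assuming the 16-class RVL-CDIP label set, chance level is 1/16 ≈ 0.0625: the reported exit 0 accuracy of 0.055 is essentially at chance, while exit 1 (0.22) recovers some signal and the final head reaches 0.75.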
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (an illustrative reconstruction in code follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 24
- total_train_batch_size: 192
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60
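The listed settings map directly onto `transformers.TrainingArguments`. Since the original training script is not published, the sketch below is a hedged reconstruction rather than the actual code; the `output_dir` value is hypothetical. Note the effective batch size: 8 per device × 24 accumulation steps = 192, matching `total_train_batch_size`.

```python
# Hedged reconstruction of the hyperparameters listed above using
# transformers.TrainingArguments (API as of Transformers 4.26.1).
# The original training script is not published; output_dir is hypothetical.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="EElayoutlmv3_rvl_cdip_finetune",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=24,  # 8 * 24 = 192 effective train batch size
    num_train_epochs=60,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```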
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Exit 0 Accuracy | Exit 1 Accuracy |
|:---|:---|:---|:---|:---|:---|:---|
No log | 0.96 | 4 | 2.7552 | 0.09 | 0.0425 | 0.0625 |
No log | 1.96 | 8 | 2.7092 | 0.15 | 0.0475 | 0.0625 |
No log | 2.96 | 12 | 2.6218 | 0.1825 | 0.0525 | 0.0625 |
No log | 3.96 | 16 | 2.5483 | 0.1925 | 0.0525 | 0.0625 |
No log | 4.96 | 20 | 2.4980 | 0.21 | 0.0525 | 0.0625 |
No log | 5.96 | 24 | 2.3901 | 0.28 | 0.0525 | 0.0625 |
No log | 6.96 | 28 | 2.2958 | 0.33 | 0.055 | 0.0625 |
No log | 7.96 | 32 | 2.2144 | 0.34 | 0.055 | 0.0625 |
No log | 8.96 | 36 | 2.1107 | 0.37 | 0.055 | 0.0625 |
No log | 9.96 | 40 | 1.9969 | 0.405 | 0.055 | 0.0625 |
No log | 10.96 | 44 | 1.8919 | 0.46 | 0.055 | 0.0625 |
No log | 11.96 | 48 | 1.7897 | 0.4975 | 0.055 | 0.0625 |
No log | 12.96 | 52 | 1.6686 | 0.525 | 0.055 | 0.0625 |
No log | 13.96 | 56 | 1.6167 | 0.555 | 0.055 | 0.0625 |
No log | 14.96 | 60 | 1.4750 | 0.605 | 0.055 | 0.0625 |
No log | 15.96 | 64 | 1.4324 | 0.6225 | 0.055 | 0.065 |
No log | 16.96 | 68 | 1.3211 | 0.645 | 0.055 | 0.0925 |
No log | 17.96 | 72 | 1.2686 | 0.6675 | 0.055 | 0.1025 |
No log | 18.96 | 76 | 1.2206 | 0.6725 | 0.055 | 0.115 |
No log | 19.96 | 80 | 1.1536 | 0.7025 | 0.055 | 0.115 |
No log | 20.96 | 84 | 1.1113 | 0.71 | 0.0525 | 0.115 |
No log | 21.96 | 88 | 1.0655 | 0.715 | 0.0525 | 0.1175 |
No log | 22.96 | 92 | 1.0423 | 0.735 | 0.0525 | 0.12 |
No log | 23.96 | 96 | 1.0043 | 0.735 | 0.0525 | 0.1175 |
No log | 24.96 | 100 | 1.0017 | 0.74 | 0.0525 | 0.12 |
No log | 25.96 | 104 | 1.0167 | 0.7175 | 0.0525 | 0.12 |
No log | 26.96 | 108 | 0.9570 | 0.74 | 0.0525 | 0.1175 |
No log | 27.96 | 112 | 0.9620 | 0.7425 | 0.0525 | 0.12 |
No log | 28.96 | 116 | 0.9466 | 0.7425 | 0.0525 | 0.1175 |
No log | 29.96 | 120 | 0.9441 | 0.7575 | 0.0525 | 0.12 |
No log | 30.96 | 124 | 0.9568 | 0.7375 | 0.0525 | 0.1175 |
No log | 31.96 | 128 | 0.9313 | 0.7525 | 0.0525 | 0.11 |
No log | 32.96 | 132 | 0.9330 | 0.74 | 0.0525 | 0.1025 |
No log | 33.96 | 136 | 0.9370 | 0.76 | 0.0525 | 0.12 |
No log | 34.96 | 140 | 0.9455 | 0.76 | 0.0525 | 0.1125 |
No log | 35.96 | 144 | 0.9459 | 0.7625 | 0.0525 | 0.1025 |
No log | 36.96 | 148 | 0.9418 | 0.7575 | 0.0525 | 0.0975 |
No log | 37.96 | 152 | 0.9352 | 0.755 | 0.0525 | 0.105 |
No log | 38.96 | 156 | 0.9377 | 0.7425 | 0.0525 | 0.1125 |
No log | 39.96 | 160 | 0.9341 | 0.7525 | 0.0525 | 0.1175 |
No log | 40.96 | 164 | 0.9452 | 0.7575 | 0.055 | 0.1475 |
No log | 41.96 | 168 | 0.9486 | 0.7575 | 0.055 | 0.175 |
No log | 42.96 | 172 | 0.9656 | 0.7525 | 0.055 | 0.1375 |
No log | 43.96 | 176 | 0.9723 | 0.7525 | 0.0575 | 0.1575 |
No log | 44.96 | 180 | 0.9682 | 0.75 | 0.0575 | 0.1775 |
No log | 45.96 | 184 | 0.9699 | 0.7575 | 0.0575 | 0.195 |
No log | 46.96 | 188 | 0.9695 | 0.7575 | 0.0575 | 0.1925 |
No log | 47.96 | 192 | 0.9850 | 0.75 | 0.0575 | 0.1975 |
No log | 48.96 | 196 | 0.9909 | 0.7575 | 0.0575 | 0.2075 |
No log | 49.96 | 200 | 0.9751 | 0.75 | 0.0575 | 0.205 |
No log | 50.96 | 204 | 0.9723 | 0.7525 | 0.0575 | 0.205 |
No log | 51.96 | 208 | 0.9829 | 0.75 | 0.0575 | 0.21 |
No log | 52.96 | 212 | 0.9833 | 0.755 | 0.0575 | 0.21 |
No log | 53.96 | 216 | 0.9789 | 0.7575 | 0.0575 | 0.2125 |
No log | 54.96 | 220 | 0.9781 | 0.7575 | 0.0575 | 0.2175 |
No log | 55.96 | 224 | 0.9853 | 0.755 | 0.0575 | 0.2225 |
No log | 56.96 | 228 | 0.9910 | 0.7525 | 0.0575 | 0.225 |
No log | 57.96 | 232 | 0.9973 | 0.75 | 0.055 | 0.2225 |
No log | 58.96 | 236 | 1.0001 | 0.75 | 0.055 | 0.2225 |
No log | 59.96 | 240 | 1.0000 | 0.75 | 0.055 | 0.22 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2