# lmv3_jordyvl_rvl_cdip_100_examples_per_class_2023-12-01_txt_vis_concat_enc_9_10_11_12_gate
This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on an unknown dataset (the model name suggests a 100-examples-per-class subset of RVL-CDIP). It achieves the following results on the evaluation set:
- Loss: 0.9605
- Accuracy: 0.785
- Exit 0 Accuracy: 0.0625
- Exit 1 Accuracy: 0.2425
- Exit 2 Accuracy: 0.5225
- Exit 3 Accuracy: 0.72
- Exit 4 Accuracy: 0.785
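
The exit accuracies above, together with the model name (`txt_vis_concat_enc_9_10_11_12_gate`), point to an early-exit setup: classification heads attached to intermediate encoder layers, with a gate deciding when inference can stop. The sketch below is purely illustrative and assumes a softmax-confidence gate; the head modules, exit indices, and `threshold` are hypothetical, not read from this checkpoint.

```python
import torch

# Hypothetical confidence-gated early exit -- an illustration, not the
# exact gating used by this checkpoint. Assumes one classifier head per
# exit layer and a softmax-confidence threshold (all hypothetical).
def early_exit_predict(hidden_states, exit_heads, threshold=0.9):
    """hidden_states: per-exit [1, seq_len, dim] tensors.
    exit_heads: one classifier nn.Module per exit.
    Returns (predicted_class, exit_index)."""
    for i, head in enumerate(exit_heads):
        logits = head(hidden_states[i][:, 0])  # classify the [CLS] token
        probs = torch.softmax(logits, dim=-1)
        confidence, pred = probs.max(dim=-1)
        # Stop at the first sufficiently confident exit, or at the last one.
        if confidence.item() >= threshold or i == len(exit_heads) - 1:
            return pred.item(), i
```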
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 24
- total_train_batch_size: 96
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60
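
These hyperparameters map directly onto `transformers.TrainingArguments`; a minimal sketch follows (the `output_dir` is hypothetical, and the custom early-exit training loop itself is not part of this card). Note that the total train batch size of 96 is the per-device batch size of 4 times the 24 gradient-accumulation steps.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters as TrainingArguments;
# output_dir is hypothetical.
training_args = TrainingArguments(
    output_dir="lmv3_rvl_cdip_early_exit",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=2,
    seed=42,
    gradient_accumulation_steps=24,  # 4 * 24 = effective batch size 96
    lr_scheduler_type="linear",
    num_train_epochs=60,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```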
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Exit 0 Accuracy | Exit 1 Accuracy | Exit 2 Accuracy | Exit 3 Accuracy | Exit 4 Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:|
No log | 0.96 | 8 | 2.6964 | 0.1225 | 0.055 | 0.0625 | 0.0625 | 0.0625 | 0.1225 |
No log | 1.96 | 16 | 2.6306 | 0.1775 | 0.05 | 0.0625 | 0.0625 | 0.0625 | 0.1775 |
No log | 2.96 | 24 | 2.5176 | 0.2325 | 0.045 | 0.0625 | 0.0625 | 0.0625 | 0.2325 |
No log | 3.96 | 32 | 2.3854 | 0.28 | 0.045 | 0.0625 | 0.0625 | 0.0625 | 0.28 |
No log | 4.96 | 40 | 2.2424 | 0.335 | 0.04 | 0.0625 | 0.0625 | 0.0625 | 0.335 |
No log | 5.96 | 48 | 2.0887 | 0.395 | 0.0425 | 0.0625 | 0.0625 | 0.0625 | 0.395 |
No log | 6.96 | 56 | 1.9008 | 0.5125 | 0.0425 | 0.0625 | 0.0625 | 0.0625 | 0.5125 |
No log | 7.96 | 64 | 1.7061 | 0.575 | 0.04 | 0.0625 | 0.0625 | 0.0625 | 0.575 |
No log | 8.96 | 72 | 1.5366 | 0.6075 | 0.0375 | 0.0625 | 0.0625 | 0.0625 | 0.6075 |
No log | 9.96 | 80 | 1.3956 | 0.6475 | 0.0375 | 0.0625 | 0.0625 | 0.0625 | 0.6475 |
No log | 10.96 | 88 | 1.2953 | 0.675 | 0.0275 | 0.0625 | 0.0625 | 0.0675 | 0.675 |
No log | 11.96 | 96 | 1.2023 | 0.6775 | 0.025 | 0.0625 | 0.0625 | 0.07 | 0.6775 |
No log | 12.96 | 104 | 1.1167 | 0.72 | 0.0325 | 0.0625 | 0.0625 | 0.0875 | 0.72 |
No log | 13.96 | 112 | 1.0342 | 0.73 | 0.03 | 0.0625 | 0.0625 | 0.1025 | 0.73 |
No log | 14.96 | 120 | 1.0137 | 0.7375 | 0.0325 | 0.0625 | 0.0625 | 0.115 | 0.7375 |
No log | 15.96 | 128 | 0.9790 | 0.7375 | 0.0325 | 0.0625 | 0.0625 | 0.1175 | 0.7375 |
No log | 16.96 | 136 | 0.9306 | 0.7675 | 0.035 | 0.0625 | 0.0625 | 0.1575 | 0.7675 |
No log | 17.96 | 144 | 0.8941 | 0.77 | 0.04 | 0.0625 | 0.0625 | 0.14 | 0.77 |
No log | 18.96 | 152 | 0.8953 | 0.765 | 0.0425 | 0.0625 | 0.0625 | 0.1825 | 0.765 |
No log | 19.96 | 160 | 0.8898 | 0.77 | 0.04 | 0.0625 | 0.0625 | 0.2175 | 0.77 |
No log | 20.96 | 168 | 0.8756 | 0.7725 | 0.04 | 0.0625 | 0.0625 | 0.2675 | 0.7725 |
No log | 21.96 | 176 | 0.9026 | 0.755 | 0.045 | 0.0625 | 0.1 | 0.4175 | 0.755 |
No log | 22.96 | 184 | 0.8717 | 0.7725 | 0.05 | 0.0625 | 0.1175 | 0.4225 | 0.7725 |
No log | 23.96 | 192 | 0.9194 | 0.7525 | 0.05 | 0.0625 | 0.15 | 0.4775 | 0.7525 |
No log | 24.96 | 200 | 0.8943 | 0.775 | 0.05 | 0.0675 | 0.1925 | 0.525 | 0.775 |
No log | 25.96 | 208 | 0.8964 | 0.77 | 0.0525 | 0.0625 | 0.215 | 0.5225 | 0.77 |
No log | 26.96 | 216 | 0.9143 | 0.76 | 0.0525 | 0.0625 | 0.25 | 0.5525 | 0.76 |
No log | 27.96 | 224 | 0.9079 | 0.7775 | 0.0525 | 0.0625 | 0.29 | 0.56 | 0.7775 |
No log | 28.96 | 232 | 0.9018 | 0.7775 | 0.055 | 0.0675 | 0.315 | 0.59 | 0.7775 |
No log | 29.96 | 240 | 0.9091 | 0.7875 | 0.055 | 0.0725 | 0.355 | 0.615 | 0.7875 |
No log | 30.96 | 248 | 0.9056 | 0.785 | 0.0625 | 0.0925 | 0.3775 | 0.64 | 0.785 |
No log | 31.96 | 256 | 0.9164 | 0.79 | 0.06 | 0.125 | 0.42 | 0.6775 | 0.79 |
No log | 32.96 | 264 | 0.9293 | 0.7875 | 0.0625 | 0.1425 | 0.4625 | 0.685 | 0.7875 |
No log | 33.96 | 272 | 0.9669 | 0.7725 | 0.0575 | 0.215 | 0.48 | 0.6875 | 0.7725 |
No log | 34.96 | 280 | 0.9342 | 0.785 | 0.06 | 0.23 | 0.4725 | 0.69 | 0.785 |
No log | 35.96 | 288 | 0.9481 | 0.7725 | 0.0625 | 0.205 | 0.4525 | 0.6525 | 0.7725 |
No log | 36.96 | 296 | 0.9447 | 0.7775 | 0.06 | 0.24 | 0.485 | 0.6875 | 0.7775 |
No log | 37.96 | 304 | 0.9494 | 0.7925 | 0.0575 | 0.24 | 0.5025 | 0.7025 | 0.7925 |
No log | 38.96 | 312 | 0.9329 | 0.775 | 0.0575 | 0.2225 | 0.46 | 0.695 | 0.775 |
No log | 39.96 | 320 | 0.9247 | 0.7875 | 0.06 | 0.23 | 0.4725 | 0.6725 | 0.7875 |
No log | 40.96 | 328 | 0.9184 | 0.7925 | 0.06 | 0.2325 | 0.465 | 0.665 | 0.7925 |
No log | 41.96 | 336 | 0.9608 | 0.8025 | 0.06 | 0.1975 | 0.4625 | 0.65 | 0.8025 |
No log | 42.96 | 344 | 0.9499 | 0.7875 | 0.06 | 0.2075 | 0.445 | 0.64 | 0.7875 |
No log | 43.96 | 352 | 0.9789 | 0.7825 | 0.06 | 0.205 | 0.495 | 0.64 | 0.7825 |
No log | 44.96 | 360 | 0.9384 | 0.78 | 0.06 | 0.2125 | 0.49 | 0.6725 | 0.78 |
No log | 45.96 | 368 | 0.9734 | 0.77 | 0.06 | 0.2075 | 0.54 | 0.7125 | 0.77 |
No log | 46.96 | 376 | 0.9647 | 0.785 | 0.0625 | 0.215 | 0.5325 | 0.735 | 0.785 |
No log | 47.96 | 384 | 0.9484 | 0.78 | 0.0625 | 0.2225 | 0.515 | 0.725 | 0.78 |
No log | 48.96 | 392 | 0.9652 | 0.7875 | 0.0625 | 0.2275 | 0.505 | 0.7325 | 0.7875 |
No log | 49.96 | 400 | 0.9570 | 0.785 | 0.0625 | 0.22 | 0.4925 | 0.7225 | 0.785 |
No log | 50.96 | 408 | 0.9432 | 0.7975 | 0.0625 | 0.2075 | 0.52 | 0.7275 | 0.7975 |
No log | 51.96 | 416 | 0.9562 | 0.79 | 0.0625 | 0.225 | 0.5275 | 0.7325 | 0.79 |
No log | 52.96 | 424 | 0.9567 | 0.79 | 0.0625 | 0.2375 | 0.5325 | 0.72 | 0.79 |
No log | 53.96 | 432 | 0.9645 | 0.7875 | 0.0625 | 0.2425 | 0.5325 | 0.7175 | 0.7875 |
No log | 54.96 | 440 | 0.9721 | 0.7825 | 0.0625 | 0.25 | 0.5275 | 0.725 | 0.7825 |
No log | 55.96 | 448 | 0.9742 | 0.785 | 0.0625 | 0.2425 | 0.52 | 0.7275 | 0.785 |
No log | 56.96 | 456 | 0.9699 | 0.785 | 0.0625 | 0.24 | 0.5225 | 0.725 | 0.785 |
No log | 57.96 | 464 | 0.9637 | 0.785 | 0.0625 | 0.245 | 0.52 | 0.725 | 0.785 |
No log | 58.96 | 472 | 0.9614 | 0.785 | 0.0625 | 0.2425 | 0.525 | 0.72 | 0.785 |
No log | 59.96 | 480 | 0.9605 | 0.785 | 0.0625 | 0.2425 | 0.5225 | 0.72 | 0.785 |
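
In every row above, the Exit 4 accuracy matches the overall accuracy, consistent with the final exit being the model's standard classification head. The per-exit columns can be reproduced by scoring every exit head on the full evaluation set; a minimal sketch, assuming per-exit logits have already been collected:

```python
import torch

def per_exit_accuracy(all_exit_logits, labels):
    """all_exit_logits: one [num_examples, num_classes] tensor per exit.
    labels: [num_examples] tensor of gold class ids.
    Returns one accuracy per exit, matching the table columns above."""
    return [
        (logits.argmax(dim=-1) == labels).float().mean().item()
        for logits in all_exit_logits
    ]
```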
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
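
To reproduce results against the versions listed above, the environment can be checked at import time; a small sketch (package import names assumed to match the listed libraries):

```python
import transformers, datasets, tokenizers, torch

# Guard against version drift relative to the card's listed versions.
assert transformers.__version__.startswith("4.26")
assert datasets.__version__.startswith("2.9")
assert tokenizers.__version__.startswith("0.13")
print(torch.__version__)  # trained with 1.13.1.post200
```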