---
license: cc-by-nc-sa-4.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: EElayoutlmv3_jordyvl_rvl_cdip_100_examples_per_class_2023-07-07_went-g025
  results: []
---

# EElayoutlmv3_jordyvl_rvl_cdip_100_examples_per_class_2023-07-07_went-g025

This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on an unknown dataset (judging by the model name, likely the jordyvl/rvl_cdip_100_examples_per_class subset of RVL-CDIP).
It achieves the following results on the evaluation set:
- Loss: 1.0715
- Accuracy: 0.7275
- Exit 0 Accuracy: 0.1125
- Exit 1 Accuracy: 0.1525
- Exit 2 Accuracy: 0.185
- Exit 3 Accuracy: 0.0625
- Exit 4 Accuracy: 0.0625

## Model description

More information needed. (The "EE" prefix and the per-exit accuracy metrics suggest an early-exit variant of LayoutLMv3, with classification heads attached at intermediate layers.)

## Intended uses & limitations

More information needed. A hedged inference sketch is given at the end of this card.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch at the end of this card):
- learning_rate: 2e-05
- train_batch_size: 12
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 24
- total_train_batch_size: 288
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Exit 0 Accuracy | Exit 1 Accuracy | Exit 2 Accuracy | Exit 3 Accuracy | Exit 4 Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------------:|:---------------:|:---------------:|:---------------:|:---------------:|
| No log | 0.72 | 2 | 2.7601 | 0.11 | 0.1025 | 0.0675 | 0.0825 | 0.0625 | 0.0625 |
| No log | 1.72 | 4 | 2.7312 | 0.115 | 0.1025 | 0.065 | 0.085 | 0.0625 | 0.0625 |
| No log | 2.72 | 6 | 2.6966 | 0.1325 | 0.1025 | 0.06 | 0.0975 | 0.0625 | 0.0625 |
| No log | 3.72 | 8 | 2.6638 | 0.1725 | 0.1375 | 0.055 | 0.115 | 0.0625 | 0.0625 |
| No log | 4.72 | 10 | 2.6223 | 0.195 | 0.1375 | 0.0575 | 0.1125 | 0.0625 | 0.0625 |
| No log | 5.72 | 12 | 2.5770 | 0.215 | 0.13 | 0.08 | 0.115 | 0.0625 | 0.0625 |
| No log | 6.72 | 14 | 2.5537 | 0.21 | 0.12 | 0.08 | 0.1125 | 0.0625 | 0.0625 |
| No log | 7.72 | 16 | 2.5364 | 0.22 | 0.1275 | 0.09 | 0.1175 | 0.0625 | 0.0625 |
| No log | 8.72 | 18 | 2.5008 | 0.2475 | 0.125 | 0.095 | 0.12 | 0.0625 | 0.0625 |
| No log | 9.72 | 20 | 2.4477 | 0.2675 | 0.115 | 0.0925 | 0.115 | 0.0625 | 0.0625 |
| No log | 10.72 | 22 | 2.3972 | 0.3075 | 0.115 | 0.12 | 0.1175 | 0.0625 | 0.0625 |
| No log | 11.72 | 24 | 2.3565 | 0.32 | 0.1125 | 0.11 | 0.1125 | 0.0625 | 0.0625 |
| No log | 12.72 | 26 | 2.2957 | 0.3425 | 0.1075 | 0.115 | 0.115 | 0.0625 | 0.0625 |
| No log | 13.72 | 28 | 2.2355 | 0.3575 | 0.105 | 0.115 | 0.1175 | 0.0625 | 0.0625 |
| No log | 14.72 | 30 | 2.1916 | 0.3625 | 0.1075 | 0.125 | 0.1275 | 0.0625 | 0.0625 |
| No log | 15.72 | 32 | 2.1467 | 0.3825 | 0.1075 | 0.13 | 0.1225 | 0.0625 | 0.0625 |
| No log | 16.72 | 34 | 2.0775 | 0.405 | 0.1075 | 0.1375 | 0.1225 | 0.0625 | 0.0625 |
| No log | 17.72 | 36 | 2.0176 | 0.435 | 0.1125 | 0.1375 | 0.1225 | 0.0625 | 0.0625 |
| No log | 18.72 | 38 | 1.9539 | 0.4725 | 0.115 | 0.1375 | 0.1225 | 0.0625 | 0.0625 |
| No log | 19.72 | 40 | 1.9007 | 0.485 | 0.105 | 0.14 | 0.1225 | 0.0625 | 0.0625 |
| No log | 20.72 | 42 | 1.8501 | 0.52 | 0.1075 | 0.14 | 0.1275 | 0.0625 | 0.0625 |
| No log | 21.72 | 44 | 1.7795 | 0.5475 | 0.1075 | 0.14 | 0.125 | 0.0625 | 0.0625 |
| No log | 22.72 | 46 | 1.7139 | 0.565 | 0.11 | 0.14 | 0.1275 | 0.0625 | 0.0625 |
| No log | 23.72 | 48 | 1.6892 | 0.57 | 0.1125 | 0.14 | 0.13 | 0.0625 | 0.0625 |
| No log | 24.72 | 50 | 1.6345 | 0.5875 | 0.11 | 0.1425 | 0.1275 | 0.0625 | 0.0625 |
| No log | 25.72 | 52 | 1.5737 | 0.5975 | 0.1125 | 0.1475 | 0.1275 | 0.0625 | 0.0625 |
| No log | 26.72 | 54 | 1.5422 | 0.6 | 0.1125 | 0.1475 | 0.135 | 0.0625 | 0.0625 |
| No log | 27.72 | 56 | 1.5227 | 0.6125 | 0.115 | 0.1475 | 0.1375 | 0.0625 | 0.0625 |
| No log | 28.72 | 58 | 1.4674 | 0.64 | 0.115 | 0.1475 | 0.1425 | 0.0625 | 0.0625 |
| No log | 29.72 | 60 | 1.4152 | 0.65 | 0.115 | 0.1475 | 0.1425 | 0.0625 | 0.0625 |
| No log | 30.72 | 62 | 1.4002 | 0.6575 | 0.115 | 0.1475 | 0.145 | 0.0625 | 0.0625 |
| No log | 31.72 | 64 | 1.3922 | 0.6625 | 0.115 | 0.145 | 0.145 | 0.0625 | 0.0625 |
| No log | 32.72 | 66 | 1.3489 | 0.6725 | 0.115 | 0.145 | 0.1475 | 0.0625 | 0.0625 |
| No log | 33.72 | 68 | 1.3166 | 0.68 | 0.115 | 0.1475 | 0.1475 | 0.0625 | 0.0625 |
| No log | 34.72 | 70 | 1.3028 | 0.685 | 0.1125 | 0.1475 | 0.1475 | 0.0625 | 0.0625 |
| No log | 35.72 | 72 | 1.2779 | 0.6975 | 0.1125 | 0.1475 | 0.1475 | 0.0625 | 0.0625 |
| No log | 36.72 | 74 | 1.2494 | 0.705 | 0.1125 | 0.1475 | 0.15 | 0.0625 | 0.0625 |
| No log | 37.72 | 76 | 1.2366 | 0.7025 | 0.1125 | 0.1475 | 0.15 | 0.0625 | 0.0625 |
| No log | 38.72 | 78 | 1.2214 | 0.705 | 0.1125 | 0.15 | 0.1525 | 0.0625 | 0.0625 |
| No log | 39.72 | 80 | 1.1999 | 0.7175 | 0.1125 | 0.1525 | 0.1525 | 0.0625 | 0.0625 |
| No log | 40.72 | 82 | 1.1793 | 0.7125 | 0.1125 | 0.1525 | 0.1575 | 0.0625 | 0.0625 |
| No log | 41.72 | 84 | 1.1680 | 0.7225 | 0.1125 | 0.1525 | 0.1575 | 0.0625 | 0.0625 |
| No log | 42.72 | 86 | 1.1625 | 0.7225 | 0.1125 | 0.1525 | 0.155 | 0.0625 | 0.0625 |
| No log | 43.72 | 88 | 1.1471 | 0.7175 | 0.1125 | 0.1525 | 0.1575 | 0.0625 | 0.0625 |
| No log | 44.72 | 90 | 1.1232 | 0.7275 | 0.1125 | 0.1525 | 0.1625 | 0.0625 | 0.0625 |
| No log | 45.72 | 92 | 1.1188 | 0.7275 | 0.1125 | 0.1525 | 0.1625 | 0.0625 | 0.0625 |
| No log | 46.72 | 94 | 1.1196 | 0.7275 | 0.1125 | 0.1525 | 0.1625 | 0.0625 | 0.0625 |
| No log | 47.72 | 96 | 1.1133 | 0.725 | 0.1125 | 0.15 | 0.1625 | 0.0625 | 0.0625 |
| No log | 48.72 | 98 | 1.1104 | 0.725 | 0.115 | 0.15 | 0.1625 | 0.0625 | 0.0625 |
| No log | 49.72 | 100 | 1.1047 | 0.73 | 0.115 | 0.15 | 0.165 | 0.0625 | 0.0625 |
| No log | 50.72 | 102 | 1.0973 | 0.7225 | 0.115 | 0.1525 | 0.17 | 0.0625 | 0.0625 |
| No log | 51.72 | 104 | 1.0866 | 0.7225 | 0.115 | 0.1525 | 0.175 | 0.0625 | 0.0625 |
| No log | 52.72 | 106 | 1.0845 | 0.73 | 0.1125 | 0.1525 | 0.1725 | 0.0625 | 0.0625 |
| No log | 53.72 | 108 | 1.0836 | 0.7275 | 0.1125 | 0.1525 | 0.1725 | 0.0625 | 0.0625 |
| No log | 54.72 | 110 | 1.0822 | 0.7225 | 0.1125 | 0.1525 | 0.1725 | 0.0625 | 0.0625 |
| No log | 55.72 | 112 | 1.0808 | 0.7275 | 0.1125 | 0.1525 | 0.18 | 0.0625 | 0.0625 |
| No log | 56.72 | 114 | 1.0766 | 0.725 | 0.1125 | 0.1525 | 0.18 | 0.0625 | 0.0625 |
| No log | 57.72 | 116 | 1.0738 | 0.73 | 0.1125 | 0.1525 | 0.1825 | 0.0625 | 0.0625 |
| No log | 58.72 | 118 | 1.0721 | 0.7275 | 0.1125 | 0.1525 | 0.185 | 0.0625 | 0.0625 |
| No log | 59.72 | 120 | 1.0715 | 0.7275 | 0.1125 | 0.1525 | 0.185 | 0.0625 | 0.0625 |

### Framework versions

- Transformers 4.26.1
- PyTorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
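
### `TrainingArguments` sketch

For convenience, the hyperparameters listed above map onto Transformers `TrainingArguments` (4.26.x) roughly as follows. This is a minimal sketch, not the authors' training script: the dataset pipeline, the early-exit model class, and any per-exit loss weighting are not documented in this card, and `output_dir` is a placeholder.

```python
# Sketch of the card's hyperparameters as TrainingArguments (Transformers 4.26.x).
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="./checkpoints",       # placeholder, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=24,   # 12 * 24 = 288 total train batch size
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=60,
    adam_beta1=0.9,                   # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```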
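
## Inference sketch

The intermediate exit heads require the authors' custom modelling code, which is not part of Transformers, so as a rough starting point the checkpoint may load with the stock LayoutLMv3 classes for final-exit classification only. The repository id below is assumed from the model name, and the processor is taken from the base checkpoint in case this repository does not ship processor files.

```python
# A minimal sketch, NOT the authors' early-exit pipeline: it loads the
# checkpoint as a plain LayoutLMv3 classifier and predicts with the final exit.
from PIL import Image
from transformers import AutoProcessor, LayoutLMv3ForSequenceClassification

# Assumed repo id (taken from the model name); adjust to the actual hub path.
repo_id = "EElayoutlmv3_jordyvl_rvl_cdip_100_examples_per_class_2023-07-07_went-g025"

# Processor from the base model; it runs Tesseract OCR on the image by default.
processor = AutoProcessor.from_pretrained("microsoft/layoutlmv3-base")
model = LayoutLMv3ForSequenceClassification.from_pretrained(repo_id)

image = Image.open("page.png").convert("RGB")  # any document page image
inputs = processor(image, return_tensors="pt")
logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```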