Output_LayoutLMv3_v2

This model is a fine-tuned version of microsoft/layoutlmv3-large on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1240
  • Precision: 0.8174
  • Recall: 0.8319
  • F1: 0.8246
  • Accuracy: 0.9762
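
The precision, recall, and F1 values above look like entity-level scores of the kind the seqeval library produces for token classification, with accuracy computed per token. The actual metric code is not published; a typical computation for a LayoutLMv3 token-classification head would look roughly like this sketch (the label list is purely illustrative):

```python
# Hypothetical metric sketch: entity-level P/R/F1 via seqeval, as commonly used
# for token classification. The real label set and metric code are not published.
import numpy as np
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

label_list = ["O", "B-HEADER", "I-HEADER", "B-QUESTION", "I-QUESTION"]  # illustrative only

def compute_metrics(logits, labels):
    # logits: (batch, seq_len, num_labels); labels: (batch, seq_len) ids, -100 = ignored
    preds = np.argmax(logits, axis=2)
    true_preds = [
        [label_list[p] for p, l in zip(pred_row, label_row) if l != -100]
        for pred_row, label_row in zip(preds, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred_row, label_row) if l != -100]
        for pred_row, label_row in zip(preds, labels)
    ]
    return {
        "precision": precision_score(true_labels, true_preds),
        "recall": recall_score(true_labels, true_preds),
        "f1": f1_score(true_labels, true_preds),
        "accuracy": accuracy_score(true_labels, true_preds),
    }
```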

Model description

More information needed

Intended uses & limitations

More information needed
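
No usage notes are provided, but since this is a LayoutLMv3 token-classification checkpoint, loading it for inference would normally follow the standard transformers pattern sketched below. The image, words, and bounding boxes are placeholders, and the real label set depends on the undisclosed training data.

```python
# Hypothetical usage sketch; the image, words, and boxes are placeholders.
import torch
from PIL import Image
from transformers import AutoModelForTokenClassification, AutoProcessor

model_id = "Noureddinesa/Output_LayoutLMv3_v2"
processor = AutoProcessor.from_pretrained(model_id, apply_ocr=False)  # supply your own OCR words/boxes
model = AutoModelForTokenClassification.from_pretrained(model_id)

image = Image.open("invoice.png").convert("RGB")                       # placeholder document image
words = ["Invoice", "No.", "12345"]                                    # placeholder OCR words
boxes = [[50, 50, 150, 80], [160, 50, 200, 80], [210, 50, 300, 80]]    # boxes normalized to 0-1000

encoding = processor(image, words, boxes=boxes, truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**encoding)

pred_ids = outputs.logits.argmax(-1).squeeze().tolist()
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"][0])
print([(tok, model.config.id2label[p]) for tok, p in zip(tokens, pred_ids)])
```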

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-07
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • training_steps: 3500
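
For reference, these settings correspond roughly to the transformers TrainingArguments below; this is a reconstruction from the values listed above, not the original training script.

```python
# Approximate reconstruction of the reported hyperparameters; not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Output_LayoutLMv3_v2",
    learning_rate=3e-7,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    max_steps=3500,
    lr_scheduler_type="linear",
    evaluation_strategy="steps",   # the results table suggests evaluation every 100 steps
    eval_steps=100,
    # adam_beta1/adam_beta2/adam_epsilon defaults already match the reported
    # betas=(0.9, 0.999) and epsilon=1e-08.
)
# These arguments would then be passed to a Trainer together with a
# LayoutLMv3ForTokenClassification model and the (undisclosed) processed dataset.
```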

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 2.27  | 100  | 0.5286          | 0.0       | 0.0    | 0.0    | 0.8867   |
| No log        | 4.55  | 200  | 0.4075          | 0.0       | 0.0    | 0.0    | 0.8867   |
| No log        | 6.82  | 300  | 0.3231          | 0.2258    | 0.0310 | 0.0545 | 0.8933   |
| No log        | 9.09  | 400  | 0.2612          | 0.5546    | 0.2920 | 0.3826 | 0.9210   |
| 0.4595        | 11.36 | 500  | 0.2246          | 0.5897    | 0.4071 | 0.4817 | 0.9295   |
| 0.4595        | 13.64 | 600  | 0.2004          | 0.6869    | 0.6018 | 0.6415 | 0.9476   |
| 0.4595        | 15.91 | 700  | 0.1866          | 0.7019    | 0.6460 | 0.6728 | 0.9514   |
| 0.4595        | 18.18 | 800  | 0.1712          | 0.7419    | 0.7124 | 0.7269 | 0.9600   |
| 0.4595        | 20.45 | 900  | 0.1599          | 0.7647    | 0.7478 | 0.7562 | 0.9638   |
| 0.1593        | 22.73 | 1000 | 0.1568          | 0.7729    | 0.7832 | 0.7780 | 0.9686   |
| 0.1593        | 25.0  | 1100 | 0.1476          | 0.7686    | 0.7788 | 0.7736 | 0.9686   |
| 0.1593        | 27.27 | 1200 | 0.1395          | 0.7930    | 0.7965 | 0.7947 | 0.9714   |
| 0.1593        | 29.55 | 1300 | 0.1372          | 0.8000    | 0.8142 | 0.8070 | 0.9733   |
| 0.1593        | 31.82 | 1400 | 0.1356          | 0.8035    | 0.8142 | 0.8088 | 0.9743   |
| 0.0987        | 34.09 | 1500 | 0.1326          | 0.7939    | 0.8009 | 0.7974 | 0.9714   |
| 0.0987        | 36.36 | 1600 | 0.1292          | 0.7939    | 0.8009 | 0.7974 | 0.9714   |
| 0.0987        | 38.64 | 1700 | 0.1300          | 0.8017    | 0.8230 | 0.8122 | 0.9743   |
| 0.0987        | 40.91 | 1800 | 0.1260          | 0.8062    | 0.8097 | 0.8079 | 0.9724   |
| 0.0987        | 43.18 | 1900 | 0.1244          | 0.8017    | 0.8230 | 0.8122 | 0.9743   |
| 0.0689        | 45.45 | 2000 | 0.1228          | 0.8150    | 0.8186 | 0.8168 | 0.9752   |
| 0.0689        | 47.73 | 2100 | 0.1230          | 0.8087    | 0.8230 | 0.8158 | 0.9752   |
| 0.0689        | 50.0  | 2200 | 0.1225          | 0.8114    | 0.8186 | 0.8150 | 0.9743   |
| 0.0689        | 52.27 | 2300 | 0.1226          | 0.8114    | 0.8186 | 0.8150 | 0.9743   |
| 0.0689        | 54.55 | 2400 | 0.1237          | 0.8174    | 0.8319 | 0.8246 | 0.9762   |
| 0.0545        | 56.82 | 2500 | 0.1234          | 0.8122    | 0.8230 | 0.8176 | 0.9752   |
| 0.0545        | 59.09 | 2600 | 0.1240          | 0.8122    | 0.8230 | 0.8176 | 0.9752   |
| 0.0545        | 61.36 | 2700 | 0.1242          | 0.8122    | 0.8230 | 0.8176 | 0.9752   |
| 0.0545        | 63.64 | 2800 | 0.1241          | 0.8122    | 0.8230 | 0.8176 | 0.9752   |
| 0.0545        | 65.91 | 2900 | 0.1253          | 0.8190    | 0.8407 | 0.8297 | 0.9771   |
| 0.0491        | 68.18 | 3000 | 0.1235          | 0.8114    | 0.8186 | 0.8150 | 0.9743   |
| 0.0491        | 70.45 | 3100 | 0.1236          | 0.8166    | 0.8274 | 0.8220 | 0.9752   |
| 0.0491        | 72.73 | 3200 | 0.1231          | 0.8166    | 0.8274 | 0.8220 | 0.9752   |
| 0.0491        | 75.0  | 3300 | 0.1239          | 0.8190    | 0.8407 | 0.8297 | 0.9771   |
| 0.0491        | 77.27 | 3400 | 0.1241          | 0.8190    | 0.8407 | 0.8297 | 0.9771   |
| 0.0442        | 79.55 | 3500 | 0.1240          | 0.8174    | 0.8319 | 0.8246 | 0.9762   |

Framework versions

  • Transformers 4.38.2
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2