layoutlmv3-finetuned-invoice

This model is a fine-tuned version of microsoft/layoutlmv3-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8510
  • Precision: 0.9058
  • Recall: 0.9175
  • F1: 0.9116
  • Accuracy: 0.8556
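
The checkpoint can be loaded like any Transformers token-classification model. Below is a minimal, hedged inference sketch; it assumes the model is used for token classification on invoice images, that the processor configuration is available from the checkpoint (otherwise fall back to microsoft/layoutlmv3-base), and that `invoice.png` is an illustrative local file.

```python
# Hedged inference sketch (assumptions: token-classification head for invoice
# fields; "invoice.png" is a placeholder document image; the processor applies
# Tesseract OCR by default, which requires pytesseract to be installed).
from PIL import Image
from transformers import AutoProcessor, LayoutLMv3ForTokenClassification

model_id = "LRavi98/layoutlmv3-finetuned-invoice"
processor = AutoProcessor.from_pretrained(model_id)  # or "microsoft/layoutlmv3-base"
model = LayoutLMv3ForTokenClassification.from_pretrained(model_id)

image = Image.open("invoice.png").convert("RGB")
encoding = processor(image, return_tensors="pt")

outputs = model(**encoding)
pred_ids = outputs.logits.argmax(-1).squeeze().tolist()
pred_labels = [model.config.id2label[i] for i in pred_ids]
print(pred_labels)
```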

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • training_steps: 2000
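
As a rough sketch, the hyperparameters above map onto the following Transformers `TrainingArguments`; the output path, evaluation interval, and logging interval are illustrative assumptions inferred from the results table below.

```python
# Hedged sketch: reproduces the listed hyperparameters with the Trainer API.
# Dataset, processor, model, and data collator setup are assumed elsewhere.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlmv3-finetuned-invoice",  # assumed output path
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    max_steps=2000,
    eval_strategy="steps",  # the card reports metrics every 100 steps
    eval_steps=100,
    logging_steps=500,      # matches the 500-step training-loss intervals
)
```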

Training results

| Training Loss | Epoch   | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.3333  | 100  | 0.6411          | 0.7779    | 0.8316 | 0.8038 | 0.7951   |
| No log        | 2.6667  | 200  | 0.5235          | 0.8209    | 0.8629 | 0.8414 | 0.8238   |
| No log        | 4.0     | 300  | 0.5130          | 0.8738    | 0.9081 | 0.8906 | 0.8504   |
| No log        | 5.3333  | 400  | 0.5742          | 0.8848    | 0.9081 | 0.8963 | 0.8431   |
| 0.5233        | 6.6667  | 500  | 0.6276          | 0.8610    | 0.8927 | 0.8766 | 0.8374   |
| 0.5233        | 8.0     | 600  | 0.6887          | 0.8818    | 0.9041 | 0.8928 | 0.8357   |
| 0.5233        | 9.3333  | 700  | 0.6323          | 0.8930    | 0.9165 | 0.9046 | 0.8628   |
| 0.5233        | 10.6667 | 800  | 0.6644          | 0.8878    | 0.9195 | 0.9034 | 0.8538   |
| 0.5233        | 12.0    | 900  | 0.7365          | 0.9138    | 0.9210 | 0.9174 | 0.8580   |
| 0.1181        | 13.3333 | 1000 | 0.7774          | 0.8939    | 0.9210 | 0.9073 | 0.8549   |
| 0.1181        | 14.6667 | 1100 | 0.8265          | 0.9090    | 0.9175 | 0.9132 | 0.8557   |
| 0.1181        | 16.0    | 1200 | 0.8112          | 0.9023    | 0.9265 | 0.9142 | 0.8546   |
| 0.1181        | 17.3333 | 1300 | 0.8212          | 0.9075    | 0.9160 | 0.9117 | 0.8596   |
| 0.1181        | 18.6667 | 1400 | 0.8931          | 0.8999    | 0.9151 | 0.9074 | 0.8509   |
| 0.0443        | 20.0    | 1500 | 0.8510          | 0.9058    | 0.9175 | 0.9116 | 0.8556   |
| 0.0443        | 21.3333 | 1600 | 0.8318          | 0.9016    | 0.9235 | 0.9124 | 0.8612   |
| 0.0443        | 22.6667 | 1700 | 0.8783          | 0.9065    | 0.9146 | 0.9105 | 0.8519   |
| 0.0443        | 24.0    | 1800 | 0.8964          | 0.9023    | 0.9126 | 0.9074 | 0.8527   |
| 0.0443        | 25.3333 | 1900 | 0.8890          | 0.9054    | 0.9175 | 0.9114 | 0.8580   |
| 0.0205        | 26.6667 | 2000 | 0.8992          | 0.9004    | 0.9165 | 0.9084 | 0.8552   |
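
The precision, recall, F1, and accuracy columns are entity-level metrics of the kind typically computed with seqeval for token classification. A hedged sketch of such a `compute_metrics` function is shown below; the `label_list` contents are hypothetical invoice tags, not the actual label set of this model.

```python
# Hedged sketch of a seqeval-based compute_metrics function, as commonly used
# when fine-tuning LayoutLMv3 for token classification. The label names below
# are hypothetical; -100 marks special/ignored tokens excluded from scoring.
import numpy as np
import evaluate

metric = evaluate.load("seqeval")
label_list = ["O", "B-INVOICE_NO", "I-INVOICE_NO", "B-TOTAL", "I-TOTAL"]  # hypothetical

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=2)

    true_predictions = [
        [label_list[p] for (p, l) in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for (p, l) in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]

    results = metric.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```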

Framework versions

  • Transformers 4.46.0.dev0
  • Pytorch 2.5.0+cu121
  • Datasets 3.0.2
  • Tokenizers 0.20.1