layoutlmv3-base-ner

This model is a fine-tuned version of microsoft/layoutlmv3-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1562
  • Footer: precision 0.0, recall 0.0, F1 0.0 (support: 5)
  • Header: precision 0.0, recall 0.0, F1 0.0 (support: 1)
  • Able: precision 0.0, recall 0.0, F1 0.0 (support: 5)
  • Aption: precision 0.0, recall 0.0, F1 0.0 (support: 2)
  • Ext: precision 0.0615, recall 0.4, F1 0.1067 (support: 10)
  • Overall Precision: 0.0310
  • Overall Recall: 0.1739
  • Overall F1: 0.0526
  • Overall Accuracy: 0.8882
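
The card does not yet include a usage example. The sketch below shows one way to run inference with this checkpoint, assuming it is published on the Hub (the repo id is a placeholder) and that the processor's default Tesseract OCR is available:

```python
# Minimal inference sketch (not part of the original card).
# The repo id below is a placeholder; replace it with this model's actual Hub path.
import torch
from PIL import Image
from transformers import AutoProcessor, AutoModelForTokenClassification

model_id = "your-username/layoutlmv3-base-ner"  # placeholder repo id

# The LayoutLMv3 processor applies Tesseract OCR by default (requires pytesseract).
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

image = Image.open("page.png").convert("RGB")
encoding = processor(image, return_tensors="pt")

with torch.no_grad():
    logits = model(**encoding).logits

predicted_ids = logits.argmax(-1).squeeze().tolist()
print([model.config.id2label[i] for i in predicted_ids])
```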

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 2
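
These settings can be expressed as Transformers `TrainingArguments` as sketched below; the mapping is an assumption, since the original training script is not included with the card:

```python
# Sketch of the reported hyperparameters as TrainingArguments (assumed mapping).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlmv3-base-ner",   # placeholder output directory
    learning_rate=3e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    num_train_epochs=2,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="epoch",  # assumption: the card reports validation metrics per epoch
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's default optimizer settings.
)
```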

Training results

| Training Loss | Epoch | Step | Validation Loss | Footer | Header | Able | Aption | Ext | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2.0796 | 1.0 | 5 | 1.4462 | P 0.0 / R 0.0 / F1 0.0 (n=5) | P 0.0 / R 0.0 / F1 0.0 (n=1) | P 0.0 / R 0.0 / F1 0.0 (n=5) | P 0.0 / R 0.0 / F1 0.0 (n=2) | P 0.0506 / R 0.4 / F1 0.0899 (n=10) | 0.0255 | 0.1739 | 0.0444 | 0.8518 |
| 1.2478 | 2.0 | 10 | 1.1562 | P 0.0 / R 0.0 / F1 0.0 (n=5) | P 0.0 / R 0.0 / F1 0.0 (n=1) | P 0.0 / R 0.0 / F1 0.0 (n=5) | P 0.0 / R 0.0 / F1 0.0 (n=2) | P 0.0615 / R 0.4 / F1 0.1067 (n=10) | 0.0310 | 0.1739 | 0.0526 | 0.8882 |
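
The per-entity cells follow the seqeval convention (precision, recall, F1, support). For reference, here is a minimal sketch of computing metrics of this kind with the `evaluate` library, using toy labels rather than this model's actual outputs:

```python
# Illustrative only: computes per-entity and overall metrics of the kind
# reported above with the `evaluate` seqeval wrapper (requires `pip install seqeval`).
import evaluate

seqeval = evaluate.load("seqeval")

# Toy predictions/references in IOB2 format (placeholder data).
predictions = [["O", "B-HEADER", "I-HEADER", "O", "B-FOOTER"]]
references  = [["O", "B-HEADER", "O",        "O", "B-FOOTER"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```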

Framework versions

  • Transformers 4.26.0
  • Pytorch 1.12.1
  • Datasets 2.9.0
  • Tokenizers 0.13.2