---
library_name: transformers
license: mit
base_model: pabloma09/layoutlm-funsd
tags:
- generated_from_trainer
model-index:
- name: layoutlm-funsd
results: []
---
# layoutlm-funsd
This model is a fine-tuned version of [pabloma09/layoutlm-funsd](https://huggingface.co/pabloma09/layoutlm-funsd) on the FUNSD dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5379
- Overall Precision: 0.7235
- Overall Recall: 0.6852
- Overall F1: 0.7039
- Overall Accuracy: 0.9016

Per-entity results:

| Entity   | Precision | Recall | F1     | Support |
|:---------|----------:|-------:|-------:|--------:|
| Header   | 0.7209    | 0.5439 | 0.6200 | 57      |
| Answer   | 0.7183    | 0.7234 | 0.7208 | 141     |
| Question | 0.7290    | 0.7019 | 0.7152 | 161     |
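These per-entity scores are entity-level precision/recall/F1 with support counts, in the style produced by seqeval (an assumption; the auto-generated card does not name its metric backend). A minimal sketch with hypothetical BIO tag sequences:

```python
from seqeval.metrics import classification_report

# Hypothetical gold and predicted tag sequences for one document.
y_true = [["B-HEADER", "I-HEADER", "O", "B-QUESTION", "B-ANSWER"]]
y_pred = [["B-HEADER", "I-HEADER", "O", "B-QUESTION", "O"]]

# Reports entity-level precision/recall/F1 and support per label.
print(classification_report(y_true, y_pred, digits=4))
```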
## Model description
This model is [LayoutLM](https://huggingface.co/docs/transformers/model_doc/layoutlm) fine-tuned for token classification on scanned forms. Building on the earlier `pabloma09/layoutlm-funsd` checkpoint, it labels each token as part of a header, question, or answer entity, using both the token text and its 2D position (bounding box) on the page.
## Intended uses & limitations
The model is intended for key-value extraction from scanned, form-like documents: given OCR'd words and their bounding boxes, it tags tokens as header, question, or answer entities. It inherits LayoutLM's constraints: bounding boxes must be normalized to a 0-1000 scale, sequences are capped at 512 tokens, and, having been trained only on FUNSD, its behavior on forms with substantially different layouts or languages is untested.
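A minimal inference sketch. The words and boxes are illustrative OCR output, and it assumes this checkpoint (with its tokenizer) is hosted at `pabloma09/layoutlm-funsd`:

```python
import torch
from transformers import LayoutLMForTokenClassification, LayoutLMTokenizer

model_id = "pabloma09/layoutlm-funsd"  # assumption: where this checkpoint is hosted
tokenizer = LayoutLMTokenizer.from_pretrained(model_id)
model = LayoutLMForTokenClassification.from_pretrained(model_id)
model.eval()

# Hypothetical OCR output: words plus boxes already on LayoutLM's 0-1000 scale.
words = ["Date:", "January", "5"]
word_boxes = [[57, 40, 134, 62], [140, 40, 230, 62], [236, 40, 250, 62]]

# Each subword token inherits the bounding box of its source word.
tokens, boxes = [], []
for word, box in zip(words, word_boxes):
    subwords = tokenizer.tokenize(word)
    tokens.extend(subwords)
    boxes.extend([box] * len(subwords))

# Add [CLS]/[SEP] with the boxes conventionally used for LayoutLM's special tokens.
input_ids = tokenizer.convert_tokens_to_ids(
    [tokenizer.cls_token] + tokens + [tokenizer.sep_token]
)
boxes = [[0, 0, 0, 0]] + boxes + [[1000, 1000, 1000, 1000]]

with torch.no_grad():
    logits = model(
        input_ids=torch.tensor([input_ids]),
        bbox=torch.tensor([boxes]),
    ).logits

# Map the highest-scoring class id of each token back to its label name.
predictions = logits.argmax(-1).squeeze(0).tolist()
print([model.config.id2label[i] for i in predictions])
```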
## Training and evaluation data
The auto-generated card does not record the dataset, but the entity types (header, question, answer) and the checkpoint name point to [FUNSD](https://guillaumejaume.github.io/FUNSD/), a dataset of 199 noisy scanned forms annotated for form understanding (149 documents for training, 50 for evaluation).
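A common way to load FUNSD from the Hub; the `nielsr/funsd` dataset id and its column names are assumptions, since the exact copy used here is not recorded:

```python
from datasets import load_dataset

# Assumption: a widely used community mirror of FUNSD on the Hub.
dataset = load_dataset("nielsr/funsd")

example = dataset["train"][0]
# Each example carries OCR'd words, their boxes, and per-word NER tags.
print(example["words"][:5], example["bboxes"][:5], example["ner_tags"][:5])
```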
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
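A sketch of `TrainingArguments` matching the list above; the `output_dir` and the surrounding `Trainer` wiring (model, datasets, collator) are placeholders not recorded in this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-funsd",   # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",           # AdamW; betas/epsilon at the defaults listed above
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                     # Native AMP mixed precision
)
```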
### Training results
| Training Loss | Epoch | Step | Validation Loss | Header | Answer | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.0751 | 1.0 | 12 | 0.4989 | {'precision': 0.5740740740740741, 'recall': 0.543859649122807, 'f1': 0.5585585585585585, 'number': 57} | {'precision': 0.673202614379085, 'recall': 0.7304964539007093, 'f1': 0.7006802721088436, 'number': 141} | {'precision': 0.6666666666666666, 'recall': 0.6708074534161491, 'f1': 0.6687306501547988, 'number': 161} | 0.6558 | 0.6741 | 0.6648 | 0.8675 |
| 0.0681 | 2.0 | 24 | 0.4233 | {'precision': 0.6739130434782609, 'recall': 0.543859649122807, 'f1': 0.6019417475728156, 'number': 57} | {'precision': 0.7394366197183099, 'recall': 0.7446808510638298, 'f1': 0.7420494699646644, 'number': 141} | {'precision': 0.7044025157232704, 'recall': 0.6956521739130435, 'f1': 0.7, 'number': 161} | 0.7147 | 0.6908 | 0.7025 | 0.9004 |
| 0.0499 | 3.0 | 36 | 0.4571 | {'precision': 0.775, 'recall': 0.543859649122807, 'f1': 0.6391752577319588, 'number': 57} | {'precision': 0.7083333333333334, 'recall': 0.723404255319149, 'f1': 0.7157894736842105, 'number': 141} | {'precision': 0.73125, 'recall': 0.7267080745341615, 'f1': 0.7289719626168223, 'number': 161} | 0.7267 | 0.6964 | 0.7112 | 0.8998 |
| 0.037 | 4.0 | 48 | 0.4636 | {'precision': 0.7045454545454546, 'recall': 0.543859649122807, 'f1': 0.613861386138614, 'number': 57} | {'precision': 0.7142857142857143, 'recall': 0.7446808510638298, 'f1': 0.7291666666666666, 'number': 141} | {'precision': 0.7222222222222222, 'recall': 0.7267080745341615, 'f1': 0.7244582043343654, 'number': 161} | 0.7167 | 0.7047 | 0.7107 | 0.9016 |
| 0.0329 | 5.0 | 60 | 0.5128 | {'precision': 0.6530612244897959, 'recall': 0.5614035087719298, 'f1': 0.6037735849056605, 'number': 57} | {'precision': 0.697986577181208, 'recall': 0.7375886524822695, 'f1': 0.7172413793103447, 'number': 141} | {'precision': 0.6706586826347305, 'recall': 0.6956521739130435, 'f1': 0.6829268292682926, 'number': 161} | 0.6795 | 0.6908 | 0.6851 | 0.8880 |
| 0.0263 | 6.0 | 72 | 0.5192 | {'precision': 0.6904761904761905, 'recall': 0.5087719298245614, 'f1': 0.5858585858585859, 'number': 57} | {'precision': 0.7183098591549296, 'recall': 0.723404255319149, 'f1': 0.7208480565371025, 'number': 141} | {'precision': 0.7484276729559748, 'recall': 0.7391304347826086, 'f1': 0.7437500000000001, 'number': 161} | 0.7289 | 0.6964 | 0.7123 | 0.8995 |
| 0.023 | 7.0 | 84 | 0.5452 | {'precision': 0.6976744186046512, 'recall': 0.5263157894736842, 'f1': 0.6, 'number': 57} | {'precision': 0.7202797202797203, 'recall': 0.7304964539007093, 'f1': 0.7253521126760565, 'number': 141} | {'precision': 0.7, 'recall': 0.6956521739130435, 'f1': 0.6978193146417445, 'number': 161} | 0.7081 | 0.6825 | 0.6950 | 0.8956 |
| 0.0205 | 8.0 | 96 | 0.5398 | {'precision': 0.6666666666666666, 'recall': 0.5614035087719298, 'f1': 0.6095238095238096, 'number': 57} | {'precision': 0.7083333333333334, 'recall': 0.723404255319149, 'f1': 0.7157894736842105, 'number': 141} | {'precision': 0.7151898734177216, 'recall': 0.7018633540372671, 'f1': 0.7084639498432601, 'number': 161} | 0.7057 | 0.6880 | 0.6968 | 0.8971 |
| 0.0182 | 9.0 | 108 | 0.5025 | {'precision': 0.62, 'recall': 0.543859649122807, 'f1': 0.5794392523364487, 'number': 57} | {'precision': 0.7482014388489209, 'recall': 0.7375886524822695, 'f1': 0.7428571428571428, 'number': 141} | {'precision': 0.7088607594936709, 'recall': 0.6956521739130435, 'f1': 0.7021943573667712, 'number': 161} | 0.7118 | 0.6880 | 0.6997 | 0.9046 |
| 0.0175 | 10.0 | 120 | 0.5017 | {'precision': 0.6888888888888889, 'recall': 0.543859649122807, 'f1': 0.6078431372549019, 'number': 57} | {'precision': 0.7183098591549296, 'recall': 0.723404255319149, 'f1': 0.7208480565371025, 'number': 141} | {'precision': 0.7133757961783439, 'recall': 0.6956521739130435, 'f1': 0.7044025157232704, 'number': 161} | 0.7122 | 0.6825 | 0.6970 | 0.9031 |
| 0.0157 | 11.0 | 132 | 0.5034 | {'precision': 0.7272727272727273, 'recall': 0.5614035087719298, 'f1': 0.6336633663366337, 'number': 57} | {'precision': 0.7357142857142858, 'recall': 0.7304964539007093, 'f1': 0.7330960854092528, 'number': 141} | {'precision': 0.7243589743589743, 'recall': 0.7018633540372671, 'f1': 0.7129337539432177, 'number': 161} | 0.7294 | 0.6908 | 0.7096 | 0.9037 |
| 0.0151 | 12.0 | 144 | 0.5181 | {'precision': 0.7209302325581395, 'recall': 0.543859649122807, 'f1': 0.6200000000000001, 'number': 57} | {'precision': 0.7183098591549296, 'recall': 0.723404255319149, 'f1': 0.7208480565371025, 'number': 141} | {'precision': 0.7290322580645161, 'recall': 0.7018633540372671, 'f1': 0.7151898734177216, 'number': 161} | 0.7235 | 0.6852 | 0.7039 | 0.9040 |
| 0.0122 | 13.0 | 156 | 0.5368 | {'precision': 0.7209302325581395, 'recall': 0.543859649122807, 'f1': 0.6200000000000001, 'number': 57} | {'precision': 0.7394366197183099, 'recall': 0.7446808510638298, 'f1': 0.7420494699646644, 'number': 141} | {'precision': 0.7261146496815286, 'recall': 0.7080745341614907, 'f1': 0.7169811320754716, 'number': 161} | 0.7310 | 0.6964 | 0.7133 | 0.9019 |
| 0.0114 | 14.0 | 168 | 0.5372 | {'precision': 0.7272727272727273, 'recall': 0.5614035087719298, 'f1': 0.6336633663366337, 'number': 57} | {'precision': 0.7272727272727273, 'recall': 0.7375886524822695, 'f1': 0.7323943661971831, 'number': 141} | {'precision': 0.7197452229299363, 'recall': 0.7018633540372671, 'f1': 0.7106918238993711, 'number': 161} | 0.7238 | 0.6936 | 0.7084 | 0.9022 |
| 0.0126 | 15.0 | 180 | 0.5379 | {'precision': 0.7209302325581395, 'recall': 0.543859649122807, 'f1': 0.6200000000000001, 'number': 57} | {'precision': 0.7183098591549296, 'recall': 0.723404255319149, 'f1': 0.7208480565371025, 'number': 141} | {'precision': 0.7290322580645161, 'recall': 0.7018633540372671, 'f1': 0.7151898734177216, 'number': 161} | 0.7235 | 0.6852 | 0.7039 | 0.9016 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
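To reproduce this environment, the versions above can be pinned at install time, e.g. `pip install transformers==4.49.0 datasets==3.3.2 tokenizers==0.21.0`, together with a CUDA 12.4 build of PyTorch 2.6.0.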