Jacques2207 committed
Commit 3f54570
1 Parent(s): 96f4f86

End of training

README.md CHANGED
@@ -14,15 +14,13 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.0174
- - Item: {'precision': 0.946751863684771, 'recall': 0.9348054679284963, 'f1': 0.9407407407407407, 'number': 951}
- - Aption: {'precision': 0.9266211604095563, 'recall': 0.9225280326197758, 'f1': 0.9245700664055849, 'number': 2943}
- - Ootnote: {'precision': 0.841726618705036, 'recall': 0.8068965517241379, 'f1': 0.823943661971831, 'number': 145}
- - Ormula: {'precision': 0.9741568112133158, 'recall': 0.9754385964912281, 'f1': 0.9747972824895901, 'number': 2280}
- - Overall Precision: 0.9450
- - Overall Recall: 0.9408
- - Overall F1: 0.9429
- - Overall Accuracy: 0.9980
+ - Loss: 0.0158
+ - Aption: {'precision': 0.9251446070091868, 'recall': 0.9238871899422358, 'f1': 0.9245154709282557, 'number': 2943}
+ - Ootnote: {'precision': 0.9455411844792376, 'recall': 0.9442556084296397, 'f1': 0.9448979591836736, 'number': 2942}
+ - Overall Precision: 0.9353
+ - Overall Recall: 0.9341
+ - Overall F1: 0.9347
+ - Overall Accuracy: 0.9982
 
 ## Model description
 
@@ -51,18 +49,18 @@ The following hyperparameters were used during training:
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Item | Aption | Ootnote | Ormula | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:-----:|:-----:|:---------------:|:--------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 0.0169 | 1.0 | 8507 | 0.0149 | {'precision': 0.8774703557312253, 'recall': 0.9337539432176656, 'f1': 0.9047376464595008, 'number': 951} | {'precision': 0.8646690813302601, 'recall': 0.8922867821950391, 'f1': 0.8782608695652173, 'number': 2943} | {'precision': 0.643312101910828, 'recall': 0.696551724137931, 'f1': 0.6688741721854304, 'number': 145} | {'precision': 0.951314088754847, 'recall': 0.968421052631579, 'f1': 0.9597913497065856, 'number': 2280} | 0.8921 | 0.9215 | 0.9066 | 0.9970 |
- | 0.0073 | 2.0 | 17014 | 0.0120 | {'precision': 0.9236326109391125, 'recall': 0.9411146161934806, 'f1': 0.9322916666666667, 'number': 951} | {'precision': 0.9089376053962901, 'recall': 0.9157322460074754, 'f1': 0.9123222748815166, 'number': 2943} | {'precision': 0.7913669064748201, 'recall': 0.7586206896551724, 'f1': 0.7746478873239436, 'number': 145} | {'precision': 0.9669708822251195, 'recall': 0.9758771929824561, 'f1': 0.9714036236629557, 'number': 2280} | 0.9296 | 0.9376 | 0.9336 | 0.9978 |
- | 0.0058 | 3.0 | 25521 | 0.0126 | {'precision': 0.9508379888268157, 'recall': 0.8948475289169295, 'f1': 0.9219934994582882, 'number': 951} | {'precision': 0.9105662983425414, 'recall': 0.8960244648318043, 'f1': 0.9032368556259633, 'number': 2943} | {'precision': 0.8, 'recall': 0.7724137931034483, 'f1': 0.7859649122807018, 'number': 145} | {'precision': 0.9717314487632509, 'recall': 0.9649122807017544, 'f1': 0.9683098591549295, 'number': 2280} | 0.9362 | 0.9179 | 0.9270 | 0.9978 |
- | 0.0039 | 4.0 | 34028 | 0.0123 | {'precision': 0.9646522234891676, 'recall': 0.889589905362776, 'f1': 0.925601750547046, 'number': 951} | {'precision': 0.913656690746474, 'recall': 0.9024804621134896, 'f1': 0.908034188034188, 'number': 2943} | {'precision': 0.738255033557047, 'recall': 0.7586206896551724, 'f1': 0.7482993197278912, 'number': 145} | {'precision': 0.9753086419753086, 'recall': 0.9701754385964912, 'f1': 0.9727352682497802, 'number': 2280} | 0.9392 | 0.9217 | 0.9304 | 0.9978 |
- | 0.0029 | 5.0 | 42535 | 0.0133 | {'precision': 0.9404761904761905, 'recall': 0.9137749737118822, 'f1': 0.9269333333333334, 'number': 951} | {'precision': 0.9182130584192439, 'recall': 0.90791709140333, 'f1': 0.9130360498889458, 'number': 2943} | {'precision': 0.8273381294964028, 'recall': 0.7931034482758621, 'f1': 0.8098591549295774, 'number': 145} | {'precision': 0.9727472527472527, 'recall': 0.9706140350877193, 'f1': 0.9716794731064764, 'number': 2280} | 0.9393 | 0.9288 | 0.9340 | 0.9979 |
- | 0.0022 | 6.0 | 51042 | 0.0148 | {'precision': 0.9139896373056995, 'recall': 0.9274447949526814, 'f1': 0.9206680584551148, 'number': 951} | {'precision': 0.9104477611940298, 'recall': 0.9119945633707102, 'f1': 0.911220505856391, 'number': 2943} | {'precision': 0.8226950354609929, 'recall': 0.8, 'f1': 0.8111888111888113, 'number': 145} | {'precision': 0.9750765194578049, 'recall': 0.9780701754385965, 'f1': 0.976571053207795, 'number': 2280} | 0.9323 | 0.9356 | 0.9340 | 0.9976 |
- | 0.0016 | 7.0 | 59549 | 0.0170 | {'precision': 0.9418729817007535, 'recall': 0.9200841219768665, 'f1': 0.9308510638297872, 'number': 951} | {'precision': 0.9165808444902163, 'recall': 0.9072375127420998, 'f1': 0.9118852459016393, 'number': 2943} | {'precision': 0.8405797101449275, 'recall': 0.8, 'f1': 0.8197879858657243, 'number': 145} | {'precision': 0.9752102700309871, 'recall': 0.9662280701754385, 'f1': 0.9706983917162371, 'number': 2280} | 0.9399 | 0.9280 | 0.9339 | 0.9977 |
- | 0.0013 | 8.0 | 68056 | 0.0187 | {'precision': 0.9455709711846318, 'recall': 0.9316508937960042, 'f1': 0.9385593220338984, 'number': 951} | {'precision': 0.9238387978142076, 'recall': 0.9191301393136255, 'f1': 0.9214784534150912, 'number': 2943} | {'precision': 0.9047619047619048, 'recall': 0.7862068965517242, 'f1': 0.8413284132841328, 'number': 145} | {'precision': 0.9723562966213252, 'recall': 0.9719298245614035, 'f1': 0.9721430138188198, 'number': 2280} | 0.9443 | 0.9370 | 0.9407 | 0.9979 |
- | 0.0009 | 9.0 | 76563 | 0.0169 | {'precision': 0.9375, 'recall': 0.9305993690851735, 'f1': 0.9340369393139841, 'number': 951} | {'precision': 0.9234449760765551, 'recall': 0.9181107713217805, 'f1': 0.9207701482364968, 'number': 2943} | {'precision': 0.8656716417910447, 'recall': 0.8, 'f1': 0.8315412186379928, 'number': 145} | {'precision': 0.9750328515111695, 'recall': 0.9763157894736842, 'f1': 0.9756738987508219, 'number': 2280} | 0.9431 | 0.9383 | 0.9407 | 0.9979 |
- | 0.0008 | 10.0 | 85070 | 0.0174 | {'precision': 0.946751863684771, 'recall': 0.9348054679284963, 'f1': 0.9407407407407407, 'number': 951} | {'precision': 0.9266211604095563, 'recall': 0.9225280326197758, 'f1': 0.9245700664055849, 'number': 2943} | {'precision': 0.841726618705036, 'recall': 0.8068965517241379, 'f1': 0.823943661971831, 'number': 145} | {'precision': 0.9741568112133158, 'recall': 0.9754385964912281, 'f1': 0.9747972824895901, 'number': 2280} | 0.9450 | 0.9408 | 0.9429 | 0.9980 |
+ | Training Loss | Epoch | Step | Validation Loss | Aption | Ootnote | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:-----:|:-----:|:---------------:|:---------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | 0.0152 | 1.0 | 8507 | 0.0147 | {'precision': 0.863031914893617, 'recall': 0.8820931022765885, 'f1': 0.8724584103512015, 'number': 2943} | {'precision': 0.902593085106383, 'recall': 0.9228416043507818, 'f1': 0.9126050420168068, 'number': 2942} | 0.8828 | 0.9025 | 0.8925 | 0.9969 |
+ | 0.0067 | 2.0 | 17014 | 0.0128 | {'precision': 0.9206239168110919, 'recall': 0.9024804621134896, 'f1': 0.911461908030199, 'number': 2943} | {'precision': 0.9459084604715673, 'recall': 0.9272603670972128, 'f1': 0.936491589426708, 'number': 2942} | 0.9333 | 0.9149 | 0.9240 | 0.9979 |
+ | 0.0049 | 3.0 | 25521 | 0.0153 | {'precision': 0.9005291005291005, 'recall': 0.8674821610601428, 'f1': 0.8836967808930426, 'number': 2943} | {'precision': 0.9435426958362738, 'recall': 0.9089055064581917, 'f1': 0.9259002770083102, 'number': 2942} | 0.9220 | 0.8882 | 0.9048 | 0.9971 |
+ | 0.0037 | 4.0 | 34028 | 0.0110 | {'precision': 0.9221803222488858, 'recall': 0.9140332993544003, 'f1': 0.9180887372013652, 'number': 2943} | {'precision': 0.946159122085048, 'recall': 0.9377974167233175, 'f1': 0.9419597132127007, 'number': 2942} | 0.9342 | 0.9259 | 0.9300 | 0.9981 |
+ | 0.0025 | 5.0 | 42535 | 0.0110 | {'precision': 0.9253680246490927, 'recall': 0.9184505606523955, 'f1': 0.9218963165075034, 'number': 2943} | {'precision': 0.9455665867853474, 'recall': 0.938817131203263, 'f1': 0.9421797714480641, 'number': 2942} | 0.9355 | 0.9286 | 0.9320 | 0.9981 |
+ | 0.0021 | 6.0 | 51042 | 0.0137 | {'precision': 0.9104477611940298, 'recall': 0.9119945633707102, 'f1': 0.911220505856391, 'number': 2943} | {'precision': 0.9331523583305056, 'recall': 0.9347382732834806, 'f1': 0.9339446425539141, 'number': 2942} | 0.9218 | 0.9234 | 0.9226 | 0.9978 |
+ | 0.0012 | 7.0 | 59549 | 0.0133 | {'precision': 0.9154399178363574, 'recall': 0.90859667006456, 'f1': 0.912005457025921, 'number': 2943} | {'precision': 0.9397260273972603, 'recall': 0.9326988443235894, 'f1': 0.9361992494029341, 'number': 2942} | 0.9276 | 0.9206 | 0.9241 | 0.9981 |
+ | 0.0013 | 8.0 | 68056 | 0.0194 | {'precision': 0.9192886456908345, 'recall': 0.9133537206931702, 'f1': 0.9163115732060677, 'number': 2943} | {'precision': 0.9442353746151214, 'recall': 0.938137321549966, 'f1': 0.9411764705882352, 'number': 2942} | 0.9318 | 0.9257 | 0.9287 | 0.9979 |
+ | 0.0007 | 9.0 | 76563 | 0.0143 | {'precision': 0.9239945466939332, 'recall': 0.9211688752973156, 'f1': 0.9225795473881231, 'number': 2943} | {'precision': 0.9457892942379816, 'recall': 0.9428959891230455, 'f1': 0.9443404255319149, 'number': 2942} | 0.9349 | 0.9320 | 0.9335 | 0.9982 |
+ | 0.0004 | 10.0 | 85070 | 0.0158 | {'precision': 0.9251446070091868, 'recall': 0.9238871899422358, 'f1': 0.9245154709282557, 'number': 2943} | {'precision': 0.9455411844792376, 'recall': 0.9442556084296397, 'f1': 0.9448979591836736, 'number': 2942} | 0.9353 | 0.9341 | 0.9347 | 0.9982 |
 
 
 ### Framework versions
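The per-entity columns in the tables above are dictionaries of the form {'precision', 'recall', 'f1', 'number'}, where 'number' is the support (count of gold entities) in the evaluation split. The truncated entity names (Item, Aption, Ootnote, Ormula) are most likely List-item, Caption, Footnote and Formula: when label strings carry no B-/I- prefix, seqeval treats the first character as the IOB tag and reports only the remainder as the entity type. A minimal sketch of how metrics in this format are produced, assuming the standard `evaluate` seqeval wrapper was used (the label strings and predictions below are illustrative, not taken from this run):

```python
# Hedged sketch (assumption: metrics came from the `evaluate` seqeval wrapper,
# as in the usual Transformers token-classification setup; labels are illustrative).
import evaluate

seqeval = evaluate.load("seqeval")

# Label strings without B-/I- prefixes, mirroring how this card's entity names arise.
references = [["CAPTION", "CAPTION", "O", "FOOTNOTE"]]
predictions = [["CAPTION", "O", "O", "FOOTNOTE"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results)
# Output follows the pattern seen above: per-entity dicts such as
# {'APTION': {'precision': ..., 'recall': ..., 'f1': ..., 'number': ...}, ...}
# plus overall_precision, overall_recall, overall_f1 and overall_accuracy.
```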
logs/events.out.tfevents.1679914872.138-2-233-57.82015.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:9af3e74125ffb44177ee40f586b6756bd5241904024dad76238bd6575cf87c72
- size 11329
+ oid sha256:fecca544221cc38cbba6d1edf0576ac8aedde7cacea1c3c681e2350c942d6040
+ size 11689
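As a usage note for the model card updated in this commit: a minimal inference sketch, assuming the checkpoint loads as a LayoutLMv3 token classifier via `AutoModelForTokenClassification`. The repo id, image path, words and bounding boxes below are placeholders, not values from this repository.

```python
# Minimal sketch; repo_id, the image and the word/box inputs are placeholders.
from PIL import Image
from transformers import AutoModelForTokenClassification, AutoProcessor

repo_id = "microsoft/layoutlmv3-base"  # substitute the fine-tuned checkpoint from this repository

# apply_ocr=False: words and normalized (0-1000) bounding boxes come from your own OCR/layout step.
processor = AutoProcessor.from_pretrained(repo_id, apply_ocr=False)
model = AutoModelForTokenClassification.from_pretrained(repo_id)

image = Image.open("page.png").convert("RGB")
words = ["Figure", "1:", "An", "example", "caption"]
boxes = [[70, 55, 120, 70], [125, 55, 140, 70], [145, 55, 165, 70], [170, 55, 230, 70], [235, 55, 300, 70]]

encoding = processor(image, words, boxes=boxes, return_tensors="pt")
logits = model(**encoding).logits  # shape: (1, sequence_length, num_labels)
pred_ids = logits.argmax(-1).squeeze().tolist()
print([model.config.id2label[i] for i in pred_ids])
```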