Sebabrata committed on
Commit 1796252
1 Parent(s): 22ff935

update model card README.md

Files changed (1):
  1. README.md +133 -0

README.md ADDED

---
license: cc-by-nc-sa-4.0
tags:
- generated_from_trainer
model-index:
- name: lmv2-g-invoice-993-doc-08-02
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# lmv2-g-invoice-993-doc-08-02

This model is a fine-tuned version of [microsoft/layoutlmv2-base-uncased](https://huggingface.co/microsoft/layoutlmv2-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3517
- Due Date Precision: 0.9277
- Due Date Recall: 0.875
- Due Date F1: 0.9006
- Due Date Number: 88
- Invoice Date Precision: 0.8182
- Invoice Date Recall: 0.9172
- Invoice Date F1: 0.8649
- Invoice Date Number: 157
- Invoice Id Precision: 0.8993
- Invoice Id Recall: 0.8741
- Invoice Id F1: 0.8865
- Invoice Id Number: 143
- Payment Terms Precision: 0.5469
- Payment Terms Recall: 0.7143
- Payment Terms F1: 0.6195
- Payment Terms Number: 49
- Receiver Address Precision: 0.7249
- Receiver Address Recall: 0.7697
- Receiver Address F1: 0.7466
- Receiver Address Number: 178
- Receiver Name Precision: 0.8270
- Receiver Name Recall: 0.8596
- Receiver Name F1: 0.8430
- Receiver Name Number: 178
- Sub Total Precision: 0.8624
- Sub Total Recall: 0.8704
- Sub Total F1: 0.8664
- Sub Total Number: 108
- Supplier Address Precision: 0.7665
- Supplier Address Recall: 0.7711
- Supplier Address F1: 0.7688
- Supplier Address Number: 166
- Supplier Name Precision: 0.7567
- Supplier Name Recall: 0.8057
- Supplier Name F1: 0.7804
- Supplier Name Number: 247
- Tax Amount Precision: 0.8333
- Tax Amount Recall: 0.8209
- Tax Amount F1: 0.8271
- Tax Amount Number: 67
- Total Precision: 0.8061
- Total Recall: 0.7557
- Total F1: 0.7801
- Total Number: 176
- Overall Precision: 0.7970
- Overall Recall: 0.8221
- Overall F1: 0.8094
- Overall Accuracy: 0.9572

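The card does not show how to run the checkpoint. The per-field metrics above suggest a token-classification head on top of LayoutLMv2, so a minimal inference sketch could look like the following. Everything in it is an assumption rather than part of the original card: the repo id, the reuse of the base processor, and the OCR setup (`LayoutLMv2Processor` needs `pytesseract`, and the model itself needs `detectron2`).

```python
import torch
from PIL import Image
from transformers import LayoutLMv2ForTokenClassification, LayoutLMv2Processor

# Assumed identifier -- adjust to wherever this checkpoint is actually hosted.
MODEL_ID = "lmv2-g-invoice-993-doc-08-02"

# The fine-tuned repo may not ship a processor; the base processor is a reasonable fallback.
processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForTokenClassification.from_pretrained(MODEL_ID)
model.eval()

image = Image.open("invoice.png").convert("RGB")  # placeholder document image

# By default the processor runs Tesseract OCR to extract words and bounding boxes.
encoding = processor(image, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**encoding).logits  # shape: (1, seq_len, num_labels)

predicted_ids = logits.argmax(-1).squeeze().tolist()
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze().tolist())
for token, label_id in zip(tokens, predicted_ids):
    print(token, model.config.id2label[label_id])
```

The printed labels are per token; turning them into field strings (due date, invoice id, total, and so on) requires grouping tokens by predicted tag, which depends on the label scheme used during training and is not documented in this card.
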
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- num_epochs: 30

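As a hedged sketch, these values map onto the 🤗 `TrainingArguments` used by the `Trainer` roughly as below. The output directory name is assumed, the batch sizes are interpreted as per-device sizes, and the Adam betas/epsilon listed above are the `Trainer` defaults.

```python
from transformers import TrainingArguments

# Hedged mapping of the hyperparameters listed above onto TrainingArguments.
# output_dir is an assumed name; the Adam settings shown are also the defaults,
# so they would not need to be set explicitly.
training_args = TrainingArguments(
    output_dir="lmv2-g-invoice-993-doc-08-02",
    learning_rate=4e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="constant",
    num_train_epochs=30,
)
```

The training and evaluation datasets, the label list, and the data collator are not described in this card, so a full `Trainer` call cannot be reconstructed from it.
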
### Training results

| Training Loss | Epoch | Step | Validation Loss | Due Date Precision | Due Date Recall | Due Date F1 | Due Date Number | Invoice Date Precision | Invoice Date Recall | Invoice Date F1 | Invoice Date Number | Invoice Id Precision | Invoice Id Recall | Invoice Id F1 | Invoice Id Number | Payment Terms Precision | Payment Terms Recall | Payment Terms F1 | Payment Terms Number | Receiver Address Precision | Receiver Address Recall | Receiver Address F1 | Receiver Address Number | Receiver Name Precision | Receiver Name Recall | Receiver Name F1 | Receiver Name Number | Sub Total Precision | Sub Total Recall | Sub Total F1 | Sub Total Number | Supplier Address Precision | Supplier Address Recall | Supplier Address F1 | Supplier Address Number | Supplier Name Precision | Supplier Name Recall | Supplier Name F1 | Supplier Name Number | Tax Amount Precision | Tax Amount Recall | Tax Amount F1 | Tax Amount Number | Total Precision | Total Recall | Total F1 | Total Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:--------------------:|:-----------------:|:-------------:|:-----------------:|:-----------------------:|:--------------------:|:----------------:|:--------------------:|:--------------------------:|:-----------------------:|:-------------------:|:-----------------------:|:-----------------------:|:--------------------:|:----------------:|:--------------------:|:-------------------:|:----------------:|:------------:|:----------------:|:--------------------------:|:-----------------------:|:-------------------:|:-----------------------:|:-----------------------:|:--------------------:|:----------------:|:--------------------:|:--------------------:|:-----------------:|:-------------:|:-----------------:|:---------------:|:------------:|:--------:|:------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.2159 | 1.0 | 794 | 0.5347 | 0.0 | 0.0 | 0.0 | 88 | 0.4828 | 0.8025 | 0.6029 | 157 | 0.5247 | 0.5944 | 0.5574 | 143 | 0.0 | 0.0 | 0.0 | 49 | 0.3738 | 0.4326 | 0.4010 | 178 | 0.3780 | 0.2697 | 0.3148 | 178 | 0.0 | 0.0 | 0.0 | 108 | 0.4375 | 0.5060 | 0.4693 | 166 | 0.4348 | 0.3239 | 0.3712 | 247 | 0.0 | 0.0 | 0.0 | 67 | 0.5278 | 0.1080 | 0.1792 | 176 | 0.4443 | 0.3333 | 0.3809 | 0.9071 |
| 0.4121 | 2.0 | 1588 | 0.3147 | 0.0 | 0.0 | 0.0 | 88 | 0.5353 | 0.9172 | 0.6761 | 157 | 0.7355 | 0.6224 | 0.6742 | 143 | 0.2245 | 0.4490 | 0.2993 | 49 | 0.5707 | 0.6573 | 0.6110 | 178 | 0.7457 | 0.7247 | 0.7350 | 178 | 0.7377 | 0.4167 | 0.5325 | 108 | 0.5802 | 0.7410 | 0.6508 | 166 | 0.6703 | 0.7490 | 0.7075 | 247 | 0.0 | 0.0 | 0.0 | 67 | 0.4639 | 0.8409 | 0.5980 | 176 | 0.5779 | 0.6435 | 0.6089 | 0.9340 |
| 0.2248 | 3.0 | 2382 | 0.2087 | 0.8519 | 0.7841 | 0.8166 | 88 | 0.7849 | 0.9299 | 0.8513 | 157 | 0.8182 | 0.8182 | 0.8182 | 143 | 0.5179 | 0.5918 | 0.5524 | 49 | 0.5799 | 0.7135 | 0.6398 | 178 | 0.8192 | 0.8146 | 0.8169 | 178 | 0.8022 | 0.6759 | 0.7337 | 108 | 0.5990 | 0.7470 | 0.6649 | 166 | 0.6522 | 0.7895 | 0.7143 | 247 | 0.8103 | 0.7015 | 0.752 | 67 | 0.7444 | 0.7614 | 0.7528 | 176 | 0.7107 | 0.7746 | 0.7412 | 0.9532 |
| 0.1303 | 4.0 | 3176 | 0.2286 | 0.8280 | 0.875 | 0.8508 | 88 | 0.8671 | 0.8726 | 0.8698 | 157 | 0.8 | 0.8392 | 0.8191 | 143 | 0.3976 | 0.6735 | 0.5 | 49 | 0.6474 | 0.6910 | 0.6685 | 178 | 0.8054 | 0.8371 | 0.8209 | 178 | 0.75 | 0.75 | 0.75 | 108 | 0.6467 | 0.6506 | 0.6486 | 166 | 0.7143 | 0.7895 | 0.7500 | 247 | 0.8333 | 0.7463 | 0.7874 | 67 | 0.7344 | 0.8011 | 0.7663 | 176 | 0.7318 | 0.7797 | 0.7550 | 0.9500 |
| 0.0814 | 5.0 | 3970 | 0.2354 | 0.8444 | 0.8636 | 0.8539 | 88 | 0.8780 | 0.9172 | 0.8972 | 157 | 0.8212 | 0.8671 | 0.8435 | 143 | 0.3908 | 0.6939 | 0.5 | 49 | 0.7174 | 0.7416 | 0.7293 | 178 | 0.8418 | 0.8371 | 0.8394 | 178 | 0.6935 | 0.7963 | 0.7414 | 108 | 0.7377 | 0.8133 | 0.7736 | 166 | 0.7118 | 0.8300 | 0.7664 | 247 | 0.6579 | 0.7463 | 0.6993 | 67 | 0.7553 | 0.8068 | 0.7802 | 176 | 0.7459 | 0.8202 | 0.7813 | 0.9545 |
| 0.0604 | 6.0 | 4764 | 0.2217 | 0.8333 | 0.9091 | 0.8696 | 88 | 0.875 | 0.8917 | 0.8833 | 157 | 0.8414 | 0.8531 | 0.8472 | 143 | 0.4848 | 0.6531 | 0.5565 | 49 | 0.6716 | 0.7697 | 0.7173 | 178 | 0.8098 | 0.8371 | 0.8232 | 178 | 0.8173 | 0.7870 | 0.8019 | 108 | 0.7098 | 0.8253 | 0.7632 | 166 | 0.7148 | 0.7611 | 0.7373 | 247 | 0.6786 | 0.8507 | 0.7550 | 67 | 0.7514 | 0.7898 | 0.7701 | 176 | 0.7518 | 0.8131 | 0.7812 | 0.9541 |
| 0.0478 | 7.0 | 5558 | 0.2268 | 0.8387 | 0.8864 | 0.8619 | 88 | 0.8286 | 0.9236 | 0.8735 | 157 | 0.8129 | 0.8811 | 0.8456 | 143 | 0.4384 | 0.6531 | 0.5246 | 49 | 0.6579 | 0.7022 | 0.6793 | 178 | 0.8258 | 0.8258 | 0.8258 | 178 | 0.8302 | 0.8148 | 0.8224 | 108 | 0.5957 | 0.6747 | 0.6328 | 166 | 0.6926 | 0.7206 | 0.7063 | 247 | 0.8529 | 0.8657 | 0.8593 | 67 | 0.8117 | 0.7102 | 0.7576 | 176 | 0.7416 | 0.7797 | 0.7602 | 0.9550 |
| 0.0361 | 8.0 | 6352 | 0.2785 | 0.6949 | 0.9318 | 0.7961 | 88 | 0.8305 | 0.9363 | 0.8802 | 157 | 0.8089 | 0.8881 | 0.8467 | 143 | 0.5441 | 0.7551 | 0.6325 | 49 | 0.6919 | 0.7697 | 0.7287 | 178 | 0.8315 | 0.8315 | 0.8315 | 178 | 0.7561 | 0.8611 | 0.8052 | 108 | 0.7253 | 0.7952 | 0.7586 | 166 | 0.6754 | 0.8340 | 0.7464 | 247 | 0.7887 | 0.8358 | 0.8116 | 67 | 0.7917 | 0.7557 | 0.7733 | 176 | 0.7438 | 0.8337 | 0.7862 | 0.9520 |
| 0.0283 | 9.0 | 7146 | 0.2838 | 0.8404 | 0.8977 | 0.8681 | 88 | 0.8412 | 0.9108 | 0.8746 | 157 | 0.8667 | 0.8182 | 0.8417 | 143 | 0.6066 | 0.7551 | 0.6727 | 49 | 0.7213 | 0.7416 | 0.7313 | 178 | 0.8644 | 0.8596 | 0.8620 | 178 | 0.8511 | 0.7407 | 0.7921 | 108 | 0.7135 | 0.7952 | 0.7521 | 166 | 0.7530 | 0.7530 | 0.7530 | 247 | 0.6522 | 0.8955 | 0.7547 | 67 | 0.8034 | 0.5341 | 0.6416 | 176 | 0.7801 | 0.7791 | 0.7796 | 0.9553 |
| 0.0253 | 10.0 | 7940 | 0.3362 | 0.7217 | 0.9432 | 0.8177 | 88 | 0.8882 | 0.9108 | 0.8994 | 157 | 0.8403 | 0.8462 | 0.8432 | 143 | 0.3980 | 0.7959 | 0.5306 | 49 | 0.6703 | 0.6966 | 0.6832 | 178 | 0.8042 | 0.8539 | 0.8283 | 178 | 0.8462 | 0.8148 | 0.8302 | 108 | 0.6667 | 0.8193 | 0.7351 | 166 | 0.7173 | 0.8219 | 0.7660 | 247 | 0.8060 | 0.8060 | 0.8060 | 67 | 0.7460 | 0.8011 | 0.7726 | 176 | 0.7384 | 0.8247 | 0.7791 | 0.9385 |
| 0.0201 | 11.0 | 8734 | 0.3310 | 0.8247 | 0.9091 | 0.8649 | 88 | 0.8820 | 0.9045 | 0.8931 | 157 | 0.8832 | 0.8462 | 0.8643 | 143 | 0.5072 | 0.7143 | 0.5932 | 49 | 0.7294 | 0.6966 | 0.7126 | 178 | 0.8314 | 0.8034 | 0.8171 | 178 | 0.8165 | 0.8241 | 0.8203 | 108 | 0.6618 | 0.8133 | 0.7297 | 166 | 0.7399 | 0.8178 | 0.7769 | 247 | 0.8281 | 0.7910 | 0.8092 | 67 | 0.75 | 0.7330 | 0.7414 | 176 | 0.7697 | 0.8048 | 0.7868 | 0.9529 |
| 0.0239 | 12.0 | 9528 | 0.2936 | 0.8736 | 0.8636 | 0.8686 | 88 | 0.8614 | 0.9108 | 0.8854 | 157 | 0.8955 | 0.8392 | 0.8664 | 143 | 0.5373 | 0.7347 | 0.6207 | 49 | 0.6818 | 0.7584 | 0.7181 | 178 | 0.8398 | 0.8539 | 0.8468 | 178 | 0.83 | 0.7685 | 0.7981 | 108 | 0.7529 | 0.7892 | 0.7706 | 166 | 0.7674 | 0.8016 | 0.7842 | 247 | 0.8966 | 0.7761 | 0.8320 | 67 | 0.7527 | 0.7784 | 0.7654 | 176 | 0.7869 | 0.8112 | 0.7989 | 0.9565 |
| 0.0229 | 13.0 | 10322 | 0.3042 | 0.8791 | 0.9091 | 0.8939 | 88 | 0.8735 | 0.9236 | 0.8978 | 157 | 0.8662 | 0.8601 | 0.8632 | 143 | 0.6613 | 0.8367 | 0.7387 | 49 | 0.7068 | 0.7584 | 0.7317 | 178 | 0.8324 | 0.8652 | 0.8485 | 178 | 0.8252 | 0.7870 | 0.8057 | 108 | 0.7278 | 0.7892 | 0.7572 | 166 | 0.7751 | 0.7814 | 0.7782 | 247 | 0.8621 | 0.7463 | 0.8000 | 67 | 0.7683 | 0.7159 | 0.7412 | 176 | 0.7938 | 0.8112 | 0.8024 | 0.9580 |
| 0.0165 | 14.0 | 11116 | 0.2715 | 0.9111 | 0.9318 | 0.9213 | 88 | 0.8802 | 0.9363 | 0.9074 | 157 | 0.8671 | 0.8671 | 0.8671 | 143 | 0.5211 | 0.7551 | 0.6167 | 49 | 0.7053 | 0.7528 | 0.7283 | 178 | 0.8115 | 0.8708 | 0.8401 | 178 | 0.9158 | 0.8056 | 0.8571 | 108 | 0.7196 | 0.8193 | 0.7662 | 166 | 0.7348 | 0.7854 | 0.7593 | 247 | 0.7733 | 0.8657 | 0.8169 | 67 | 0.7943 | 0.7898 | 0.7920 | 176 | 0.7836 | 0.8304 | 0.8064 | 0.9600 |
| 0.0221 | 15.0 | 11910 | 0.2866 | 0.8161 | 0.8068 | 0.8114 | 88 | 0.8720 | 0.9108 | 0.8910 | 157 | 0.8986 | 0.8671 | 0.8826 | 143 | 0.4722 | 0.6939 | 0.5620 | 49 | 0.7204 | 0.7528 | 0.7363 | 178 | 0.8232 | 0.8371 | 0.8301 | 178 | 0.8571 | 0.8333 | 0.8451 | 108 | 0.7216 | 0.7651 | 0.7427 | 166 | 0.7293 | 0.7854 | 0.7563 | 247 | 0.8868 | 0.7015 | 0.7833 | 67 | 0.8255 | 0.6989 | 0.7569 | 176 | 0.7838 | 0.7938 | 0.7888 | 0.9552 |
| 0.0173 | 16.0 | 12704 | 0.3234 | 0.7685 | 0.9432 | 0.8469 | 88 | 0.8512 | 0.9108 | 0.88 | 157 | 0.8288 | 0.8462 | 0.8374 | 143 | 0.4474 | 0.6939 | 0.544 | 49 | 0.6915 | 0.7303 | 0.7104 | 178 | 0.8365 | 0.7472 | 0.7893 | 178 | 0.6596 | 0.8611 | 0.7470 | 108 | 0.6372 | 0.8675 | 0.7347 | 166 | 0.6823 | 0.8259 | 0.7473 | 247 | 0.7333 | 0.8209 | 0.7746 | 67 | 0.7513 | 0.8068 | 0.7781 | 176 | 0.7223 | 0.8234 | 0.7695 | 0.9532 |
| 0.0159 | 17.0 | 13498 | 0.3301 | 0.8652 | 0.875 | 0.8701 | 88 | 0.8480 | 0.9236 | 0.8841 | 157 | 0.8921 | 0.8671 | 0.8794 | 143 | 0.5522 | 0.7551 | 0.6379 | 49 | 0.7027 | 0.7303 | 0.7163 | 178 | 0.7989 | 0.8483 | 0.8229 | 178 | 0.7863 | 0.8519 | 0.8178 | 108 | 0.7711 | 0.7711 | 0.7711 | 166 | 0.6877 | 0.7935 | 0.7368 | 247 | 0.8116 | 0.8358 | 0.8235 | 67 | 0.7976 | 0.7614 | 0.7791 | 176 | 0.7720 | 0.8157 | 0.7933 | 0.9554 |
| 0.0156 | 18.0 | 14292 | 0.3390 | 0.8261 | 0.8636 | 0.8444 | 88 | 0.8412 | 0.9108 | 0.8746 | 157 | 0.8794 | 0.8671 | 0.8732 | 143 | 0.5968 | 0.7551 | 0.6667 | 49 | 0.6682 | 0.7921 | 0.7249 | 178 | 0.7967 | 0.8146 | 0.8056 | 178 | 0.9195 | 0.7407 | 0.8205 | 108 | 0.7321 | 0.7410 | 0.7365 | 166 | 0.7333 | 0.7571 | 0.7450 | 247 | 0.8197 | 0.7463 | 0.7813 | 67 | 0.7797 | 0.7841 | 0.7819 | 176 | 0.7746 | 0.7990 | 0.7866 | 0.9548 |
| 0.0125 | 19.0 | 15086 | 0.3517 | 0.9277 | 0.875 | 0.9006 | 88 | 0.8182 | 0.9172 | 0.8649 | 157 | 0.8993 | 0.8741 | 0.8865 | 143 | 0.5469 | 0.7143 | 0.6195 | 49 | 0.7249 | 0.7697 | 0.7466 | 178 | 0.8270 | 0.8596 | 0.8430 | 178 | 0.8624 | 0.8704 | 0.8664 | 108 | 0.7665 | 0.7711 | 0.7688 | 166 | 0.7567 | 0.8057 | 0.7804 | 247 | 0.8333 | 0.8209 | 0.8271 | 67 | 0.8061 | 0.7557 | 0.7801 | 176 | 0.7970 | 0.8221 | 0.8094 | 0.9572 |
| 0.0132 | 20.0 | 15880 | 0.3682 | 0.9241 | 0.8295 | 0.8743 | 88 | 0.8631 | 0.9236 | 0.8923 | 157 | 0.9030 | 0.8462 | 0.8736 | 143 | 0.55 | 0.6735 | 0.6055 | 49 | 0.6818 | 0.7584 | 0.7181 | 178 | 0.8488 | 0.8202 | 0.8343 | 178 | 0.8190 | 0.7963 | 0.8075 | 108 | 0.7081 | 0.7892 | 0.7464 | 166 | 0.7764 | 0.7449 | 0.7603 | 247 | 0.7160 | 0.8657 | 0.7838 | 67 | 0.8110 | 0.7557 | 0.7824 | 176 | 0.7865 | 0.7996 | 0.7930 | 0.9543 |
| 0.0112 | 21.0 | 16674 | 0.3974 | 0.8721 | 0.8523 | 0.8621 | 88 | 0.8249 | 0.9299 | 0.8743 | 157 | 0.8929 | 0.8741 | 0.8834 | 143 | 0.5205 | 0.7755 | 0.6230 | 49 | 0.6569 | 0.7528 | 0.7016 | 178 | 0.7677 | 0.8539 | 0.8085 | 178 | 0.8246 | 0.8704 | 0.8468 | 108 | 0.7326 | 0.7590 | 0.7456 | 166 | 0.7273 | 0.7773 | 0.7515 | 247 | 0.7746 | 0.8209 | 0.7971 | 67 | 0.7852 | 0.6648 | 0.72 | 176 | 0.7609 | 0.8054 | 0.7825 | 0.9513 |
| 0.0157 | 22.0 | 17468 | 0.3658 | 0.9390 | 0.875 | 0.9059 | 88 | 0.8412 | 0.9108 | 0.8746 | 157 | 0.9065 | 0.8811 | 0.8936 | 143 | 0.5075 | 0.6939 | 0.5862 | 49 | 0.6837 | 0.7528 | 0.7166 | 178 | 0.8415 | 0.8652 | 0.8532 | 178 | 0.875 | 0.7778 | 0.8235 | 108 | 0.6473 | 0.8072 | 0.7185 | 166 | 0.7540 | 0.7692 | 0.7615 | 247 | 0.8621 | 0.7463 | 0.8000 | 67 | 0.7949 | 0.7045 | 0.7470 | 176 | 0.7783 | 0.8028 | 0.7904 | 0.9525 |
| 0.0104 | 23.0 | 18262 | 0.3755 | 0.9302 | 0.9091 | 0.9195 | 88 | 0.8727 | 0.9172 | 0.8944 | 157 | 0.8477 | 0.8951 | 0.8707 | 143 | 0.5893 | 0.6735 | 0.6286 | 49 | 0.5947 | 0.7584 | 0.6667 | 178 | 0.7023 | 0.8483 | 0.7684 | 178 | 0.7787 | 0.8796 | 0.8261 | 108 | 0.7321 | 0.7410 | 0.7365 | 166 | 0.75 | 0.7409 | 0.7454 | 247 | 0.7714 | 0.8060 | 0.7883 | 67 | 0.8057 | 0.8011 | 0.8034 | 176 | 0.7546 | 0.8137 | 0.7831 | 0.9502 |
| 0.018 | 24.0 | 19056 | 0.3719 | 0.8571 | 0.8182 | 0.8372 | 88 | 0.8683 | 0.9236 | 0.8951 | 157 | 0.8690 | 0.8811 | 0.8750 | 143 | 0.5781 | 0.7551 | 0.6549 | 49 | 0.6604 | 0.7865 | 0.7179 | 178 | 0.7937 | 0.8427 | 0.8174 | 178 | 0.9310 | 0.75 | 0.8308 | 108 | 0.7363 | 0.8072 | 0.7701 | 166 | 0.7412 | 0.7652 | 0.7530 | 247 | 0.8596 | 0.7313 | 0.7903 | 67 | 0.7765 | 0.75 | 0.7630 | 176 | 0.7785 | 0.8060 | 0.7920 | 0.9553 |
| 0.0088 | 25.0 | 19850 | 0.3638 | 0.8876 | 0.8977 | 0.8927 | 88 | 0.8902 | 0.9299 | 0.9097 | 157 | 0.8301 | 0.8881 | 0.8581 | 143 | 0.6032 | 0.7755 | 0.6786 | 49 | 0.6853 | 0.7584 | 0.72 | 178 | 0.8683 | 0.8146 | 0.8406 | 178 | 0.9111 | 0.7593 | 0.8283 | 108 | 0.6952 | 0.7831 | 0.7365 | 166 | 0.74 | 0.7490 | 0.7445 | 247 | 0.7945 | 0.8657 | 0.8286 | 67 | 0.8068 | 0.8068 | 0.8068 | 176 | 0.7874 | 0.8137 | 0.8004 | 0.9561 |
| 0.009 | 26.0 | 20644 | 0.3683 | 0.9146 | 0.8523 | 0.8824 | 88 | 0.8229 | 0.9172 | 0.8675 | 157 | 0.9007 | 0.8881 | 0.8944 | 143 | 0.6607 | 0.7551 | 0.7048 | 49 | 0.7316 | 0.7809 | 0.7554 | 178 | 0.8441 | 0.8820 | 0.8626 | 178 | 0.8317 | 0.7778 | 0.8038 | 108 | 0.7310 | 0.7530 | 0.7418 | 166 | 0.7576 | 0.8097 | 0.7828 | 247 | 0.8286 | 0.8657 | 0.8467 | 67 | 0.7791 | 0.7614 | 0.7701 | 176 | 0.7960 | 0.8221 | 0.8088 | 0.9584 |
| 0.0105 | 27.0 | 21438 | 0.3624 | 0.8280 | 0.875 | 0.8508 | 88 | 0.8352 | 0.9363 | 0.8829 | 157 | 0.8592 | 0.8531 | 0.8561 | 143 | 0.4795 | 0.7143 | 0.5738 | 49 | 0.7158 | 0.7360 | 0.7258 | 178 | 0.8197 | 0.8427 | 0.8310 | 178 | 0.7068 | 0.8704 | 0.7801 | 108 | 0.6878 | 0.7831 | 0.7324 | 166 | 0.7741 | 0.7490 | 0.7613 | 247 | 0.8088 | 0.8209 | 0.8148 | 67 | 0.7644 | 0.7557 | 0.76 | 176 | 0.7616 | 0.8086 | 0.7844 | 0.9552 |
| 0.0088 | 28.0 | 22232 | 0.3755 | 0.7938 | 0.875 | 0.8324 | 88 | 0.8882 | 0.9108 | 0.8994 | 157 | 0.8705 | 0.8462 | 0.8582 | 143 | 0.6481 | 0.7143 | 0.6796 | 49 | 0.6618 | 0.7697 | 0.7117 | 178 | 0.8370 | 0.8652 | 0.8508 | 178 | 0.9277 | 0.7130 | 0.8063 | 108 | 0.7414 | 0.7771 | 0.7588 | 166 | 0.7603 | 0.8219 | 0.7899 | 247 | 0.94 | 0.7015 | 0.8034 | 67 | 0.7901 | 0.7273 | 0.7574 | 176 | 0.7928 | 0.8035 | 0.7981 | 0.9559 |
| 0.0101 | 29.0 | 23026 | 0.4108 | 0.8587 | 0.8977 | 0.8778 | 88 | 0.8765 | 0.9045 | 0.8903 | 157 | 0.8676 | 0.8252 | 0.8459 | 143 | 0.5286 | 0.7551 | 0.6218 | 49 | 0.7005 | 0.7360 | 0.7178 | 178 | 0.8162 | 0.8483 | 0.8320 | 178 | 0.8646 | 0.7685 | 0.8137 | 108 | 0.7225 | 0.7530 | 0.7375 | 166 | 0.7236 | 0.8057 | 0.7625 | 247 | 0.9423 | 0.7313 | 0.8235 | 67 | 0.7870 | 0.7557 | 0.7710 | 176 | 0.7808 | 0.8009 | 0.7907 | 0.9526 |
| 0.0087 | 30.0 | 23820 | 0.3898 | 0.8764 | 0.8864 | 0.8814 | 88 | 0.9114 | 0.9172 | 0.9143 | 157 | 0.9015 | 0.8322 | 0.8655 | 143 | 0.5333 | 0.6531 | 0.5872 | 49 | 0.6502 | 0.7416 | 0.6929 | 178 | 0.8101 | 0.8146 | 0.8123 | 178 | 0.9529 | 0.75 | 0.8394 | 108 | 0.7922 | 0.7349 | 0.7625 | 166 | 0.7635 | 0.7449 | 0.7541 | 247 | 0.8947 | 0.7612 | 0.8226 | 67 | 0.7702 | 0.7045 | 0.7359 | 176 | 0.7979 | 0.7784 | 0.7880 | 0.9533 |

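The per-field Precision / Recall / F1 / Number columns look like entity-level (seqeval-style) metrics, with Number being the count of gold entities of that type in the evaluation set; the card does not state the metric implementation, so treat that as an assumption. The F1 values are consistent with the usual harmonic mean of precision and recall, e.g. for the reported Due Date scores:

```python
# Sanity check: F1 = 2 * P * R / (P + R) for the reported Due Date scores.
precision, recall = 0.9277, 0.875
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.9006 -- matches the "Due Date F1" value above
```
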
### Framework versions

- Transformers 4.22.0.dev0
- Pytorch 1.12.0+cu113
- Datasets 2.2.2
- Tokenizers 0.12.1