lombardata committed on
Commit 06a7742
1 Parent(s): 3e91cc7

Model save

Files changed (2):
  1. README.md +124 -84
  2. pytorch_model.bin +1 -1
README.md CHANGED
@@ -1,11 +1,7 @@
  ---
- language:
- - eng
  license: apache-2.0
  base_model: facebook/dinov2-large
  tags:
- - multilabel-image-classification
- - multilabel
  - generated_from_trainer
  metrics:
  - accuracy
@@ -19,14 +15,14 @@ should probably proofread and complete it, then remove this comment. -->

  # dinov2-large-2024_01_05-kornia_img-size518_batch-size32_epochs70_freeze

- This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on the multilabel_complete_dataset dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.0840
- - F1 Micro: 0.8543
- - F1 Macro: 0.7343
- - Roc Auc: 0.9077
- - Accuracy: 0.5606
- - Learning Rate: 0.0001

  ## Model description

@@ -51,82 +47,126 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 70

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Roc Auc | Accuracy | Rate |
- |:-------------:|:-----:|:-----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:------:|
- | No log | 1.0 | 274 | 0.1358 | 0.7376 | 0.5756 | 0.8276 | 0.4456 | 0.01 |
- | 0.1895 | 2.0 | 548 | 0.1422 | 0.7463 | 0.6131 | 0.8433 | 0.4358 | 0.01 |
- | 0.1895 | 3.0 | 822 | 0.2134 | 0.7273 | 0.5242 | 0.8305 | 0.3842 | 0.01 |
- | 0.1668 | 4.0 | 1096 | 0.1450 | 0.7034 | 0.5474 | 0.7947 | 0.4438 | 0.01 |
- | 0.1668 | 5.0 | 1370 | 0.1329 | 0.7611 | 0.6195 | 0.8536 | 0.4438 | 0.01 |
- | 0.1666 | 6.0 | 1644 | 0.1324 | 0.7528 | 0.5625 | 0.8411 | 0.4445 | 0.01 |
- | 0.1666 | 7.0 | 1918 | 0.1345 | 0.7496 | 0.5690 | 0.8390 | 0.4313 | 0.01 |
- | 0.1664 | 8.0 | 2192 | 0.1381 | 0.7502 | 0.5628 | 0.8397 | 0.4323 | 0.01 |
- | 0.1664 | 9.0 | 2466 | 0.1369 | 0.7396 | 0.5492 | 0.8220 | 0.4403 | 0.01 |
- | 0.1656 | 10.0 | 2740 | 0.1361 | 0.7327 | 0.5282 | 0.8212 | 0.4424 | 0.01 |
- | 0.166 | 11.0 | 3014 | 0.1381 | 0.7434 | 0.5428 | 0.8371 | 0.4278 | 0.01 |
- | 0.166 | 12.0 | 3288 | 0.1345 | 0.7355 | 0.5619 | 0.8279 | 0.4449 | 0.01 |
- | 0.1585 | 13.0 | 3562 | 0.1155 | 0.8009 | 0.6501 | 0.8746 | 0.4902 | 0.001 |
- | 0.1585 | 14.0 | 3836 | 0.1116 | 0.8079 | 0.6697 | 0.8751 | 0.5042 | 0.001 |
- | 0.133 | 15.0 | 4110 | 0.1073 | 0.8149 | 0.6736 | 0.8772 | 0.5181 | 0.001 |
- | 0.133 | 16.0 | 4384 | 0.1048 | 0.8238 | 0.7056 | 0.8975 | 0.5084 | 0.001 |
- | 0.1289 | 17.0 | 4658 | 0.1025 | 0.8209 | 0.6896 | 0.8839 | 0.5244 | 0.001 |
- | 0.1289 | 18.0 | 4932 | 0.1026 | 0.8290 | 0.7045 | 0.8916 | 0.5321 | 0.001 |
- | 0.1227 | 19.0 | 5206 | 0.1012 | 0.8306 | 0.6905 | 0.8941 | 0.5279 | 0.001 |
- | 0.1227 | 20.0 | 5480 | 0.0997 | 0.8280 | 0.6831 | 0.8930 | 0.5216 | 0.001 |
- | 0.1202 | 21.0 | 5754 | 0.0989 | 0.8300 | 0.6927 | 0.8896 | 0.5352 | 0.001 |
- | 0.12 | 22.0 | 6028 | 0.0996 | 0.8280 | 0.6961 | 0.8893 | 0.5209 | 0.001 |
- | 0.12 | 23.0 | 6302 | 0.0972 | 0.8319 | 0.6959 | 0.8956 | 0.5195 | 0.001 |
- | 0.1179 | 24.0 | 6576 | 0.1008 | 0.8271 | 0.6881 | 0.8916 | 0.5213 | 0.001 |
- | 0.1179 | 25.0 | 6850 | 0.0983 | 0.8283 | 0.6860 | 0.8863 | 0.5269 | 0.001 |
- | 0.1166 | 26.0 | 7124 | 0.0985 | 0.8284 | 0.6806 | 0.8876 | 0.5311 | 0.001 |
- | 0.1166 | 27.0 | 7398 | 0.0957 | 0.8305 | 0.6901 | 0.8876 | 0.5324 | 0.001 |
- | 0.1158 | 28.0 | 7672 | 0.0995 | 0.8292 | 0.7054 | 0.8935 | 0.5178 | 0.001 |
- | 0.1158 | 29.0 | 7946 | 0.0933 | 0.8364 | 0.7026 | 0.8971 | 0.5335 | 0.001 |
- | 0.114 | 30.0 | 8220 | 0.0947 | 0.8351 | 0.7110 | 0.9019 | 0.5258 | 0.001 |
- | 0.114 | 31.0 | 8494 | 0.0967 | 0.8365 | 0.7175 | 0.9046 | 0.5331 | 0.001 |
- | 0.1134 | 32.0 | 8768 | 0.0949 | 0.8354 | 0.6933 | 0.8948 | 0.5324 | 0.001 |
- | 0.113 | 33.0 | 9042 | 0.0951 | 0.8367 | 0.6973 | 0.8967 | 0.5363 | 0.001 |
- | 0.113 | 34.0 | 9316 | 0.0936 | 0.8335 | 0.6878 | 0.8876 | 0.5380 | 0.001 |
- | 0.1124 | 35.0 | 9590 | 0.0936 | 0.8340 | 0.6856 | 0.8944 | 0.5311 | 0.001 |
- | 0.1124 | 36.0 | 9864 | 0.0934 | 0.8456 | 0.7298 | 0.9031 | 0.5454 | 0.0001 |
- | 0.1083 | 37.0 | 10138 | 0.0924 | 0.8457 | 0.7189 | 0.8999 | 0.5468 | 0.0001 |
- | 0.1083 | 38.0 | 10412 | 0.0915 | 0.8449 | 0.7089 | 0.9004 | 0.5450 | 0.0001 |
- | 0.1034 | 39.0 | 10686 | 0.0902 | 0.8488 | 0.7252 | 0.9078 | 0.5485 | 0.0001 |
- | 0.1034 | 40.0 | 10960 | 0.0906 | 0.8459 | 0.7182 | 0.9011 | 0.5495 | 0.0001 |
- | 0.1024 | 41.0 | 11234 | 0.0894 | 0.8481 | 0.7130 | 0.9020 | 0.5506 | 0.0001 |
- | 0.1004 | 42.0 | 11508 | 0.0873 | 0.8457 | 0.7148 | 0.8977 | 0.5520 | 0.0001 |
- | 0.1004 | 43.0 | 11782 | 0.0870 | 0.8495 | 0.7182 | 0.9062 | 0.5537 | 0.0001 |
- | 0.0998 | 44.0 | 12056 | 0.0868 | 0.8486 | 0.7261 | 0.9033 | 0.5499 | 0.0001 |
- | 0.0998 | 45.0 | 12330 | 0.0868 | 0.8493 | 0.7236 | 0.9053 | 0.5551 | 0.0001 |
- | 0.0975 | 46.0 | 12604 | 0.0865 | 0.8490 | 0.7318 | 0.9072 | 0.5513 | 0.0001 |
- | 0.0975 | 47.0 | 12878 | 0.0860 | 0.8512 | 0.7390 | 0.9088 | 0.5548 | 0.0001 |
- | 0.099 | 48.0 | 13152 | 0.0860 | 0.8510 | 0.7360 | 0.9055 | 0.5558 | 0.0001 |
- | 0.099 | 49.0 | 13426 | 0.0858 | 0.8500 | 0.7362 | 0.9058 | 0.5548 | 0.0001 |
- | 0.0972 | 50.0 | 13700 | 0.0856 | 0.8505 | 0.7257 | 0.9033 | 0.5586 | 0.0001 |
- | 0.0972 | 51.0 | 13974 | 0.0856 | 0.8500 | 0.7409 | 0.9038 | 0.5579 | 0.0001 |
- | 0.0957 | 52.0 | 14248 | 0.0859 | 0.8508 | 0.7232 | 0.9035 | 0.5569 | 0.0001 |
- | 0.0964 | 53.0 | 14522 | 0.0849 | 0.8521 | 0.7276 | 0.9058 | 0.5628 | 0.0001 |
- | 0.0964 | 54.0 | 14796 | 0.0852 | 0.8539 | 0.7395 | 0.9116 | 0.5537 | 0.0001 |
- | 0.0955 | 55.0 | 15070 | 0.0851 | 0.8511 | 0.7354 | 0.9041 | 0.5565 | 0.0001 |
- | 0.0955 | 56.0 | 15344 | 0.0849 | 0.8529 | 0.7367 | 0.9067 | 0.5572 | 0.0001 |
- | 0.095 | 57.0 | 15618 | 0.0848 | 0.8494 | 0.7242 | 0.8994 | 0.5537 | 0.0001 |
- | 0.095 | 58.0 | 15892 | 0.0845 | 0.8512 | 0.7363 | 0.9029 | 0.5593 | 0.0001 |
- | 0.093 | 59.0 | 16166 | 0.0840 | 0.8531 | 0.7390 | 0.9058 | 0.5607 | 0.0001 |
- | 0.093 | 60.0 | 16440 | 0.0847 | 0.8528 | 0.7473 | 0.9116 | 0.5562 | 0.0001 |
- | 0.0936 | 61.0 | 16714 | 0.0843 | 0.8517 | 0.7425 | 0.9078 | 0.5523 | 0.0001 |
- | 0.0936 | 62.0 | 16988 | 0.0844 | 0.8515 | 0.7456 | 0.9053 | 0.5541 | 0.0001 |
- | 0.0932 | 63.0 | 17262 | 0.0840 | 0.8535 | 0.7344 | 0.9062 | 0.5576 | 0.0001 |
- | 0.0933 | 64.0 | 17536 | 0.0840 | 0.8543 | 0.7405 | 0.9072 | 0.5614 | 0.0001 |
- | 0.0933 | 65.0 | 17810 | 0.0840 | 0.8507 | 0.7354 | 0.9016 | 0.5579 | 0.0001 |
- | 0.0921 | 66.0 | 18084 | 0.0841 | 0.8529 | 0.7297 | 0.9066 | 0.5569 | 0.0001 |
- | 0.0921 | 67.0 | 18358 | 0.0838 | 0.8540 | 0.7393 | 0.9100 | 0.5541 | 0.0001 |
- | 0.0913 | 68.0 | 18632 | 0.0836 | 0.8541 | 0.7403 | 0.9090 | 0.5572 | 0.0001 |
- | 0.0913 | 69.0 | 18906 | 0.0835 | 0.8548 | 0.7494 | 0.9100 | 0.5583 | 0.0001 |
- | 0.0911 | 70.0 | 19180 | 0.0831 | 0.8552 | 0.7487 | 0.9104 | 0.5562 | 0.0001 |


  ### Framework versions
 
  ---
  license: apache-2.0
  base_model: facebook/dinov2-large
  tags:
  - generated_from_trainer
  metrics:
  - accuracy

  # dinov2-large-2024_01_05-kornia_img-size518_batch-size32_epochs70_freeze

+ This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on the None dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 0.0819
+ - F1 Micro: 0.8564
+ - F1 Macro: 0.7560
+ - Roc Auc: 0.9061
+ - Accuracy: 0.5656
+ - Learning Rate: 0.0000

  ## Model description

  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
+ - num_epochs: 140

  ### Training results

+ | Training Loss | Epoch | Step | Accuracy | F1 Macro | F1 Micro | Validation Loss | Roc Auc | Rate |
+ |:-------------:|:-----:|:-----:|:--------:|:--------:|:--------:|:---------------:|:-------:|:------:|
+ | No log | 1.0 | 274 | 0.4456 | 0.5756 | 0.7376 | 0.1358 | 0.8276 | 0.01 |
+ | 0.1895 | 2.0 | 548 | 0.4358 | 0.6131 | 0.7463 | 0.1422 | 0.8433 | 0.01 |
+ | 0.1895 | 3.0 | 822 | 0.3842 | 0.5242 | 0.7273 | 0.2134 | 0.8305 | 0.01 |
+ | 0.1668 | 4.0 | 1096 | 0.4438 | 0.5474 | 0.7034 | 0.1450 | 0.7947 | 0.01 |
+ | 0.1668 | 5.0 | 1370 | 0.4438 | 0.6195 | 0.7611 | 0.1329 | 0.8536 | 0.01 |
+ | 0.1666 | 6.0 | 1644 | 0.4445 | 0.5625 | 0.7528 | 0.1324 | 0.8411 | 0.01 |
+ | 0.1666 | 7.0 | 1918 | 0.4313 | 0.5690 | 0.7496 | 0.1345 | 0.8390 | 0.01 |
+ | 0.1664 | 8.0 | 2192 | 0.4323 | 0.5628 | 0.7502 | 0.1381 | 0.8397 | 0.01 |
+ | 0.1664 | 9.0 | 2466 | 0.4403 | 0.5492 | 0.7396 | 0.1369 | 0.8220 | 0.01 |
+ | 0.1656 | 10.0 | 2740 | 0.4424 | 0.5282 | 0.7327 | 0.1361 | 0.8212 | 0.01 |
+ | 0.166 | 11.0 | 3014 | 0.4278 | 0.5428 | 0.7434 | 0.1381 | 0.8371 | 0.01 |
+ | 0.166 | 12.0 | 3288 | 0.4449 | 0.5619 | 0.7355 | 0.1345 | 0.8279 | 0.01 |
+ | 0.1585 | 13.0 | 3562 | 0.4902 | 0.6501 | 0.8009 | 0.1155 | 0.8746 | 0.001 |
+ | 0.1585 | 14.0 | 3836 | 0.5042 | 0.6697 | 0.8079 | 0.1116 | 0.8751 | 0.001 |
+ | 0.133 | 15.0 | 4110 | 0.5181 | 0.6736 | 0.8149 | 0.1073 | 0.8772 | 0.001 |
+ | 0.133 | 16.0 | 4384 | 0.5084 | 0.7056 | 0.8238 | 0.1048 | 0.8975 | 0.001 |
+ | 0.1289 | 17.0 | 4658 | 0.5244 | 0.6896 | 0.8209 | 0.1025 | 0.8839 | 0.001 |
+ | 0.1289 | 18.0 | 4932 | 0.5321 | 0.7045 | 0.8290 | 0.1026 | 0.8916 | 0.001 |
+ | 0.1227 | 19.0 | 5206 | 0.5279 | 0.6905 | 0.8306 | 0.1012 | 0.8941 | 0.001 |
+ | 0.1227 | 20.0 | 5480 | 0.5216 | 0.6831 | 0.8280 | 0.0997 | 0.8930 | 0.001 |
+ | 0.1202 | 21.0 | 5754 | 0.5352 | 0.6927 | 0.8300 | 0.0989 | 0.8896 | 0.001 |
+ | 0.12 | 22.0 | 6028 | 0.5209 | 0.6961 | 0.8280 | 0.0996 | 0.8893 | 0.001 |
+ | 0.12 | 23.0 | 6302 | 0.5195 | 0.6959 | 0.8319 | 0.0972 | 0.8956 | 0.001 |
+ | 0.1179 | 24.0 | 6576 | 0.5213 | 0.6881 | 0.8271 | 0.1008 | 0.8916 | 0.001 |
+ | 0.1179 | 25.0 | 6850 | 0.5269 | 0.6860 | 0.8283 | 0.0983 | 0.8863 | 0.001 |
+ | 0.1166 | 26.0 | 7124 | 0.5311 | 0.6806 | 0.8284 | 0.0985 | 0.8876 | 0.001 |
+ | 0.1166 | 27.0 | 7398 | 0.5324 | 0.6901 | 0.8305 | 0.0957 | 0.8876 | 0.001 |
+ | 0.1158 | 28.0 | 7672 | 0.5178 | 0.7054 | 0.8292 | 0.0995 | 0.8935 | 0.001 |
+ | 0.1158 | 29.0 | 7946 | 0.5335 | 0.7026 | 0.8364 | 0.0933 | 0.8971 | 0.001 |
+ | 0.114 | 30.0 | 8220 | 0.5258 | 0.7110 | 0.8351 | 0.0947 | 0.9019 | 0.001 |
+ | 0.114 | 31.0 | 8494 | 0.5331 | 0.7175 | 0.8365 | 0.0967 | 0.9046 | 0.001 |
+ | 0.1134 | 32.0 | 8768 | 0.5324 | 0.6933 | 0.8354 | 0.0949 | 0.8948 | 0.001 |
+ | 0.113 | 33.0 | 9042 | 0.5363 | 0.6973 | 0.8367 | 0.0951 | 0.8967 | 0.001 |
+ | 0.113 | 34.0 | 9316 | 0.5380 | 0.6878 | 0.8335 | 0.0936 | 0.8876 | 0.001 |
+ | 0.1124 | 35.0 | 9590 | 0.5311 | 0.6856 | 0.8340 | 0.0936 | 0.8944 | 0.001 |
+ | 0.1124 | 36.0 | 9864 | 0.5454 | 0.7298 | 0.8456 | 0.0934 | 0.9031 | 0.0001 |
+ | 0.1083 | 37.0 | 10138 | 0.5468 | 0.7189 | 0.8457 | 0.0924 | 0.8999 | 0.0001 |
+ | 0.1083 | 38.0 | 10412 | 0.5450 | 0.7089 | 0.8449 | 0.0915 | 0.9004 | 0.0001 |
+ | 0.1034 | 39.0 | 10686 | 0.5485 | 0.7252 | 0.8488 | 0.0902 | 0.9078 | 0.0001 |
+ | 0.1034 | 40.0 | 10960 | 0.5495 | 0.7182 | 0.8459 | 0.0906 | 0.9011 | 0.0001 |
+ | 0.1024 | 41.0 | 11234 | 0.5506 | 0.7130 | 0.8481 | 0.0894 | 0.9020 | 0.0001 |
+ | 0.1004 | 42.0 | 11508 | 0.5520 | 0.7148 | 0.8457 | 0.0873 | 0.8977 | 0.0001 |
+ | 0.1004 | 43.0 | 11782 | 0.5537 | 0.7182 | 0.8495 | 0.0870 | 0.9062 | 0.0001 |
+ | 0.0998 | 44.0 | 12056 | 0.5499 | 0.7261 | 0.8486 | 0.0868 | 0.9033 | 0.0001 |
+ | 0.0998 | 45.0 | 12330 | 0.5551 | 0.7236 | 0.8493 | 0.0868 | 0.9053 | 0.0001 |
+ | 0.0975 | 46.0 | 12604 | 0.5513 | 0.7318 | 0.8490 | 0.0865 | 0.9072 | 0.0001 |
+ | 0.0975 | 47.0 | 12878 | 0.5548 | 0.7390 | 0.8512 | 0.0860 | 0.9088 | 0.0001 |
+ | 0.099 | 48.0 | 13152 | 0.5558 | 0.7360 | 0.8510 | 0.0860 | 0.9055 | 0.0001 |
+ | 0.099 | 49.0 | 13426 | 0.5548 | 0.7362 | 0.8500 | 0.0858 | 0.9058 | 0.0001 |
+ | 0.0972 | 50.0 | 13700 | 0.5586 | 0.7257 | 0.8505 | 0.0856 | 0.9033 | 0.0001 |
+ | 0.0972 | 51.0 | 13974 | 0.5579 | 0.7409 | 0.8500 | 0.0856 | 0.9038 | 0.0001 |
+ | 0.0957 | 52.0 | 14248 | 0.5569 | 0.7232 | 0.8508 | 0.0859 | 0.9035 | 0.0001 |
+ | 0.0964 | 53.0 | 14522 | 0.5628 | 0.7276 | 0.8521 | 0.0849 | 0.9058 | 0.0001 |
+ | 0.0964 | 54.0 | 14796 | 0.5537 | 0.7395 | 0.8539 | 0.0852 | 0.9116 | 0.0001 |
+ | 0.0955 | 55.0 | 15070 | 0.5565 | 0.7354 | 0.8511 | 0.0851 | 0.9041 | 0.0001 |
+ | 0.0955 | 56.0 | 15344 | 0.5572 | 0.7367 | 0.8529 | 0.0849 | 0.9067 | 0.0001 |
+ | 0.095 | 57.0 | 15618 | 0.5537 | 0.7242 | 0.8494 | 0.0848 | 0.8994 | 0.0001 |
+ | 0.095 | 58.0 | 15892 | 0.5593 | 0.7363 | 0.8512 | 0.0845 | 0.9029 | 0.0001 |
+ | 0.093 | 59.0 | 16166 | 0.5607 | 0.7390 | 0.8531 | 0.0840 | 0.9058 | 0.0001 |
+ | 0.093 | 60.0 | 16440 | 0.5562 | 0.7473 | 0.8528 | 0.0847 | 0.9116 | 0.0001 |
+ | 0.0936 | 61.0 | 16714 | 0.5523 | 0.7425 | 0.8517 | 0.0843 | 0.9078 | 0.0001 |
+ | 0.0936 | 62.0 | 16988 | 0.5541 | 0.7456 | 0.8515 | 0.0844 | 0.9053 | 0.0001 |
+ | 0.0932 | 63.0 | 17262 | 0.5576 | 0.7344 | 0.8535 | 0.0840 | 0.9062 | 0.0001 |
+ | 0.0933 | 64.0 | 17536 | 0.5614 | 0.7405 | 0.8543 | 0.0840 | 0.9072 | 0.0001 |
+ | 0.0933 | 65.0 | 17810 | 0.5579 | 0.7354 | 0.8507 | 0.0840 | 0.9016 | 0.0001 |
+ | 0.0921 | 66.0 | 18084 | 0.5569 | 0.7297 | 0.8529 | 0.0841 | 0.9066 | 0.0001 |
+ | 0.0921 | 67.0 | 18358 | 0.5541 | 0.7393 | 0.8540 | 0.0838 | 0.9100 | 0.0001 |
+ | 0.0913 | 68.0 | 18632 | 0.5572 | 0.7403 | 0.8541 | 0.0836 | 0.9090 | 0.0001 |
+ | 0.0913 | 69.0 | 18906 | 0.5583 | 0.7494 | 0.8548 | 0.0835 | 0.9100 | 0.0001 |
+ | 0.0911 | 70.0 | 19180 | 0.5562 | 0.7487 | 0.8552 | 0.0831 | 0.9104 | 0.0001 |
+ | 0.0911 | 71.0 | 19454 | 0.5579 | 0.7484 | 0.8557 | 0.0835 | 0.9102 | 0.0001 |
+ | 0.0907 | 72.0 | 19728 | 0.5611 | 0.7446 | 0.8532 | 0.0832 | 0.9037 | 0.0001 |
+ | 0.0905 | 73.0 | 20002 | 0.5576 | 0.7512 | 0.8558 | 0.0827 | 0.9105 | 0.0001 |
+ | 0.0905 | 74.0 | 20276 | 0.5590 | 0.7519 | 0.8548 | 0.0835 | 0.9090 | 0.0001 |
+ | 0.0896 | 75.0 | 20550 | 0.5565 | 0.7428 | 0.8535 | 0.0829 | 0.9053 | 0.0001 |
+ | 0.0896 | 76.0 | 20824 | 0.5642 | 0.7449 | 0.8561 | 0.0828 | 0.9091 | 0.0001 |
+ | 0.089 | 77.0 | 21098 | 0.5604 | 0.7507 | 0.8568 | 0.0827 | 0.9102 | 0.0001 |
+ | 0.089 | 78.0 | 21372 | 0.5579 | 0.7436 | 0.8529 | 0.0833 | 0.9067 | 0.0001 |
+ | 0.0892 | 79.0 | 21646 | 0.5590 | 0.7502 | 0.8540 | 0.0830 | 0.9055 | 0.0001 |
+ | 0.0892 | 80.0 | 21920 | 0.5600 | 0.7461 | 0.8548 | 0.0827 | 0.9049 | 1e-05 |
+ | 0.0879 | 81.0 | 22194 | 0.5607 | 0.7543 | 0.8576 | 0.0823 | 0.9116 | 1e-05 |
+ | 0.0879 | 82.0 | 22468 | 0.5632 | 0.7536 | 0.8576 | 0.0822 | 0.9112 | 1e-05 |
+ | 0.0867 | 83.0 | 22742 | 0.5625 | 0.7520 | 0.8554 | 0.0822 | 0.9058 | 1e-05 |
+ | 0.0864 | 84.0 | 23016 | 0.5639 | 0.7511 | 0.8551 | 0.0821 | 0.9072 | 1e-05 |
+ | 0.0864 | 85.0 | 23290 | 0.5618 | 0.7533 | 0.8560 | 0.0820 | 0.9067 | 1e-05 |
+ | 0.0865 | 86.0 | 23564 | 0.5600 | 0.7496 | 0.8553 | 0.0821 | 0.9060 | 1e-05 |
+ | 0.0865 | 87.0 | 23838 | 0.5586 | 0.7519 | 0.8559 | 0.0817 | 0.9081 | 1e-05 |
+ | 0.0868 | 88.0 | 24112 | 0.5621 | 0.7526 | 0.8558 | 0.0817 | 0.9082 | 1e-05 |
+ | 0.0868 | 89.0 | 24386 | 0.5639 | 0.7536 | 0.8570 | 0.0818 | 0.9083 | 1e-05 |
+ | 0.0857 | 90.0 | 24660 | 0.5618 | 0.7522 | 0.8558 | 0.0818 | 0.9081 | 1e-05 |
+ | 0.0857 | 91.0 | 24934 | 0.5632 | 0.7496 | 0.8569 | 0.0818 | 0.9081 | 1e-05 |
+ | 0.0862 | 92.0 | 25208 | 0.5649 | 0.7552 | 0.8566 | 0.0821 | 0.9093 | 1e-05 |
+ | 0.0862 | 93.0 | 25482 | 0.5628 | 0.7580 | 0.8589 | 0.0815 | 0.9130 | 1e-05 |
+ | 0.0851 | 94.0 | 25756 | 0.5600 | 0.7566 | 0.8571 | 0.0816 | 0.9117 | 1e-05 |
+ | 0.0854 | 95.0 | 26030 | 0.5632 | 0.7553 | 0.8564 | 0.0815 | 0.9100 | 1e-05 |
+ | 0.0854 | 96.0 | 26304 | 0.5621 | 0.7585 | 0.8576 | 0.0815 | 0.9124 | 1e-05 |
+ | 0.0854 | 97.0 | 26578 | 0.5628 | 0.7579 | 0.8576 | 0.0817 | 0.9107 | 1e-05 |
+ | 0.0854 | 98.0 | 26852 | 0.5639 | 0.7527 | 0.8571 | 0.0816 | 0.9100 | 1e-05 |
+ | 0.0855 | 99.0 | 27126 | 0.5642 | 0.7556 | 0.8578 | 0.0818 | 0.9086 | 1e-05 |
+ | 0.0855 | 100.0 | 27400 | 0.5632 | 0.7533 | 0.8571 | 0.0816 | 0.9080 | 0.0000 |
+ | 0.0837 | 101.0 | 27674 | 0.5645 | 0.7553 | 0.8575 | 0.0814 | 0.9093 | 0.0000 |
+ | 0.0837 | 102.0 | 27948 | 0.5652 | 0.7559 | 0.8572 | 0.0814 | 0.9099 | 0.0000 |
+ | 0.085 | 103.0 | 28222 | 0.5645 | 0.7566 | 0.8570 | 0.0816 | 0.9085 | 0.0000 |
+ | 0.085 | 104.0 | 28496 | 0.5645 | 0.7573 | 0.8576 | 0.0812 | 0.9102 | 0.0000 |
+ | 0.0844 | 105.0 | 28770 | 0.5604 | 0.7589 | 0.8572 | 0.0817 | 0.9124 | 0.0000 |
+ | 0.0845 | 106.0 | 29044 | 0.5628 | 0.7514 | 0.8563 | 0.0814 | 0.9079 | 0.0000 |
+ | 0.0845 | 107.0 | 29318 | 0.5635 | 0.7490 | 0.8558 | 0.0817 | 0.9058 | 0.0000 |
+ | 0.0854 | 108.0 | 29592 | 0.5642 | 0.7569 | 0.8569 | 0.0816 | 0.9094 | 0.0000 |
+ | 0.0854 | 109.0 | 29866 | 0.5652 | 0.7558 | 0.8574 | 0.0814 | 0.9107 | 0.0000 |
+ | 0.0854 | 110.0 | 30140 | 0.5639 | 0.7565 | 0.8578 | 0.0813 | 0.9118 | 0.0000 |
+ | 0.0854 | 111.0 | 30414 | 0.5639 | 0.7579 | 0.8576 | 0.0814 | 0.9115 | 0.0000 |
+ | 0.0851 | 112.0 | 30688 | 0.5632 | 0.7576 | 0.8581 | 0.0817 | 0.9108 | 0.0000 |
+ | 0.0851 | 113.0 | 30962 | 0.5614 | 0.7563 | 0.8583 | 0.0815 | 0.9128 | 0.0000 |
+ | 0.0848 | 114.0 | 31236 | 0.5656 | 0.7560 | 0.8564 | 0.0819 | 0.9061 | 0.0000 |
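For readers comparing the F1 Micro and F1 Macro columns above: micro-averaging pools true/false positive counts across all labels before computing F1, while macro-averaging computes F1 per label and then averages, so rare labels drag the macro score down (here macro stays well below micro throughout training). A minimal self-contained sketch on toy binary label vectors (not the card's dataset):

```python
def f1_micro_macro(y_true, y_pred):
    """Micro- and macro-averaged F1 for binary multilabel vectors."""
    n_labels = len(y_true[0])
    tp = [0] * n_labels
    fp = [0] * n_labels
    fn = [0] * n_labels
    for truth, pred in zip(y_true, y_pred):
        for i in range(n_labels):
            tp[i] += int(truth[i] == 1 and pred[i] == 1)
            fp[i] += int(truth[i] == 0 and pred[i] == 1)
            fn[i] += int(truth[i] == 1 and pred[i] == 0)

    def f1(t, p, n):
        denom = 2 * t + p + n
        return 2 * t / denom if denom else 0.0

    # Micro: pool the counts over all labels, then compute one F1.
    micro = f1(sum(tp), sum(fp), sum(fn))
    # Macro: compute F1 per label, then take the unweighted mean.
    macro = sum(f1(tp[i], fp[i], fn[i]) for i in range(n_labels)) / n_labels
    return micro, macro
```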


  ### Framework versions
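The card reports multilabel metrics (F1 micro/macro, per-label accuracy), which implies that at inference each class is scored independently with a sigmoid and a cutoff, rather than a softmax over mutually exclusive classes. A minimal sketch of that decision rule in plain Python; the 0.5 threshold is an assumption, as the card does not state one:

```python
import math

def multilabel_predict(logits, threshold=0.5):
    # Each label is decided independently via a sigmoid,
    # unlike single-label softmax classification where
    # exactly one class would be chosen.
    probs = [1.0 / (1.0 + math.exp(-z)) for z in logits]
    return [int(p > threshold) for p in probs]

# e.g. multilabel_predict([2.0, -1.0, 0.3]) -> [1, 0, 1]
```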
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:a54d04b918e7614769e37ce667508168a49e9547847f42bd7e5dee936aa8d54a
  size 1228201126

  version https://git-lfs.github.com/spec/v1
+ oid sha256:a8704818f2ff8107823cc3ce0f095e82470dff944254bdaabfd01184ea69269d
  size 1228201126
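The `pytorch_model.bin` entry is a Git LFS pointer file, not the weights themselves: only the `oid` changes here, because the retrained weights hash differently while the serialized size happens to stay identical. The `oid` is the SHA-256 of the stored blob, so a downloaded file can be checked against its pointer locally; a small sketch:

```python
import hashlib

def lfs_oid(data: bytes) -> str:
    # Git LFS identifies a blob by the SHA-256 of its raw content.
    return "sha256:" + hashlib.sha256(data).hexdigest()

def lfs_pointer(data: bytes) -> str:
    # Reconstruct the three-line pointer file for a given blob.
    return (
        "version https://git-lfs.github.com/spec/v1\n"
        f"oid {lfs_oid(data)}\n"
        f"size {len(data)}\n"
    )
```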