SwinV2-Base-Document-Classifier

This model is a fine-tuned version of microsoft/swinv2-base-patch4-window16-256 for document classification; the training dataset is not specified. It achieves the following results on the evaluation set:

  • Loss: 0.0187
  • Accuracy: 0.9964
  • F1: 0.9964
  • Precision: 0.9964
  • Recall: 0.9964

Model description

More information needed

Intended uses & limitations

More information needed
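
Pending documentation, a minimal inference sketch is shown below. It assumes the checkpoint follows the standard transformers image-classification interface; the image path is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Repo id taken from this card; "document.png" is a placeholder path.
repo_id = "amaye15/SwinV2-Base-Document-Classifier"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

# Preprocess a single document image and run a forward pass.
image = Image.open("document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
print(model.config.id2label[pred])  # predicted document class label
```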

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • training_steps: 10000
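
The sketch below maps the listed values onto transformers TrainingArguments. It is a hedged reconstruction, not the original training script; the output directory is hypothetical, and anything not listed above is left at its library default.

```python
from transformers import TrainingArguments

# Sketch only: the hyperparameters reported above, expressed as
# TrainingArguments. The Adam betas/epsilon match the values noted
# in the optimizer entry (which are also the transformers defaults).
training_args = TrainingArguments(
    output_dir="swinv2-base-document-classifier",  # hypothetical path
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    max_steps=10_000,
)
```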

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 0.4373 | 0.005 | 50 | 0.0894 | 0.9683 | 0.9683 | 0.9698 | 0.9683 |
| 0.1053 | 0.01 | 100 | 0.0463 | 0.9846 | 0.9846 | 0.9850 | 0.9846 |
| 0.0675 | 0.015 | 150 | 0.0604 | 0.9836 | 0.9836 | 0.9839 | 0.9836 |
| 0.0722 | 0.02 | 200 | 0.0336 | 0.9906 | 0.9906 | 0.9906 | 0.9906 |
| 0.079 | 0.025 | 250 | 0.0337 | 0.9908 | 0.9908 | 0.9908 | 0.9908 |
| 0.0707 | 0.03 | 300 | 0.0402 | 0.9886 | 0.9886 | 0.9887 | 0.9886 |
| 0.0565 | 0.035 | 350 | 0.0369 | 0.9918 | 0.9918 | 0.9918 | 0.9918 |
| 0.0406 | 0.04 | 400 | 0.0392 | 0.9914 | 0.9914 | 0.9916 | 0.9914 |
| 0.0436 | 0.045 | 450 | 0.0298 | 0.9936 | 0.9936 | 0.9936 | 0.9936 |
| 0.0288 | 0.05 | 500 | 0.0310 | 0.9952 | 0.9952 | 0.9952 | 0.9952 |
| 0.0507 | 0.055 | 550 | 0.0437 | 0.9908 | 0.9908 | 0.9909 | 0.9908 |
| 0.0447 | 0.06 | 600 | 0.0299 | 0.9952 | 0.9952 | 0.9952 | 0.9952 |
| 0.0414 | 0.065 | 650 | 0.0338 | 0.9940 | 0.9940 | 0.9940 | 0.9940 |
| 0.0275 | 0.07 | 700 | 0.0338 | 0.9934 | 0.9934 | 0.9934 | 0.9934 |
| 0.0384 | 0.075 | 750 | 0.0339 | 0.9942 | 0.9942 | 0.9942 | 0.9942 |
| 0.0319 | 0.08 | 800 | 0.0301 | 0.9950 | 0.9950 | 0.9950 | 0.9950 |
| 0.0637 | 0.085 | 850 | 0.0407 | 0.9934 | 0.9934 | 0.9935 | 0.9934 |
| 0.0333 | 0.09 | 900 | 0.0282 | 0.9950 | 0.9950 | 0.9950 | 0.9950 |
| 0.0436 | 0.095 | 950 | 0.0261 | 0.9948 | 0.9948 | 0.9948 | 0.9948 |
| 0.0295 | 0.1 | 1000 | 0.0475 | 0.9918 | 0.9918 | 0.9920 | 0.9918 |
| 0.06 | 0.105 | 1050 | 0.0516 | 0.9908 | 0.9908 | 0.9910 | 0.9908 |
| 0.0553 | 0.11 | 1100 | 0.0234 | 0.9950 | 0.9950 | 0.9950 | 0.9950 |
| 0.0492 | 0.115 | 1150 | 0.0371 | 0.9924 | 0.9924 | 0.9924 | 0.9924 |
| 0.0374 | 0.12 | 1200 | 0.0268 | 0.9938 | 0.9938 | 0.9939 | 0.9938 |
| 0.0477 | 0.125 | 1250 | 0.0227 | 0.9944 | 0.9944 | 0.9944 | 0.9944 |
| 0.0399 | 0.13 | 1300 | 0.0272 | 0.9942 | 0.9942 | 0.9942 | 0.9942 |
| 0.0357 | 0.135 | 1350 | 0.0233 | 0.9952 | 0.9952 | 0.9952 | 0.9952 |
| 0.0109 | 0.14 | 1400 | 0.0253 | 0.9952 | 0.9952 | 0.9952 | 0.9952 |
| 0.0384 | 0.145 | 1450 | 0.0271 | 0.9940 | 0.9940 | 0.9940 | 0.9940 |
| 0.031 | 0.15 | 1500 | 0.0308 | 0.9940 | 0.9940 | 0.9940 | 0.9940 |
| 0.0349 | 0.155 | 1550 | 0.0295 | 0.9950 | 0.9950 | 0.9950 | 0.9950 |
| 0.0457 | 0.16 | 1600 | 0.0325 | 0.9938 | 0.9938 | 0.9939 | 0.9938 |
| 0.0439 | 0.165 | 1650 | 0.0374 | 0.9918 | 0.9918 | 0.9919 | 0.9918 |
| 0.0351 | 0.17 | 1700 | 0.0304 | 0.9936 | 0.9936 | 0.9936 | 0.9936 |
| 0.0365 | 0.175 | 1750 | 0.0284 | 0.9940 | 0.9940 | 0.9940 | 0.9940 |
| 0.0401 | 0.18 | 1800 | 0.0285 | 0.9940 | 0.9940 | 0.9940 | 0.9940 |
| 0.0356 | 0.185 | 1850 | 0.0234 | 0.9950 | 0.9950 | 0.9950 | 0.9950 |
| 0.0157 | 0.19 | 1900 | 0.0373 | 0.9934 | 0.9934 | 0.9936 | 0.9934 |
| 0.0304 | 0.195 | 1950 | 0.0216 | 0.9954 | 0.9954 | 0.9954 | 0.9954 |
| 0.0259 | 0.2 | 2000 | 0.0281 | 0.9944 | 0.9944 | 0.9944 | 0.9944 |
| 0.0344 | 0.205 | 2050 | 0.0272 | 0.9950 | 0.9950 | 0.9950 | 0.9950 |
| 0.0393 | 0.21 | 2100 | 0.0234 | 0.9958 | 0.9958 | 0.9958 | 0.9958 |
| 0.0165 | 0.215 | 2150 | 0.0251 | 0.9952 | 0.9952 | 0.9952 | 0.9952 |
| 0.0336 | 0.22 | 2200 | 0.0297 | 0.9938 | 0.9938 | 0.9938 | 0.9938 |
| 0.0333 | 0.225 | 2250 | 0.0258 | 0.9946 | 0.9946 | 0.9946 | 0.9946 |
| 0.0302 | 0.23 | 2300 | 0.0294 | 0.9936 | 0.9936 | 0.9936 | 0.9936 |
| 0.0461 | 0.235 | 2350 | 0.0218 | 0.9950 | 0.9950 | 0.9950 | 0.9950 |
| 0.0439 | 0.24 | 2400 | 0.0308 | 0.9932 | 0.9932 | 0.9933 | 0.9932 |
| 0.0253 | 0.245 | 2450 | 0.0229 | 0.9958 | 0.9958 | 0.9958 | 0.9958 |
| 0.0206 | 0.25 | 2500 | 0.0257 | 0.9946 | 0.9946 | 0.9946 | 0.9946 |
| 0.0285 | 0.255 | 2550 | 0.0219 | 0.9958 | 0.9958 | 0.9958 | 0.9958 |
| 0.032 | 0.26 | 2600 | 0.0250 | 0.9950 | 0.9950 | 0.9950 | 0.9950 |
| 0.0467 | 0.265 | 2650 | 0.0296 | 0.9934 | 0.9934 | 0.9935 | 0.9934 |
| 0.0186 | 0.27 | 2700 | 0.0241 | 0.9952 | 0.9952 | 0.9952 | 0.9952 |
| 0.0377 | 0.275 | 2750 | 0.0252 | 0.9956 | 0.9956 | 0.9956 | 0.9956 |
| 0.0245 | 0.28 | 2800 | 0.0244 | 0.9952 | 0.9952 | 0.9952 | 0.9952 |
| 0.0238 | 0.285 | 2850 | 0.0216 | 0.9960 | 0.9960 | 0.9960 | 0.9960 |
| 0.0381 | 0.29 | 2900 | 0.0234 | 0.9956 | 0.9956 | 0.9956 | 0.9956 |
| 0.0215 | 0.295 | 2950 | 0.0242 | 0.9958 | 0.9958 | 0.9958 | 0.9958 |
| 0.0327 | 0.3 | 3000 | 0.0342 | 0.9926 | 0.9926 | 0.9927 | 0.9926 |
| 0.0274 | 0.305 | 3050 | 0.0229 | 0.9956 | 0.9956 | 0.9956 | 0.9956 |
| 0.0294 | 0.31 | 3100 | 0.0214 | 0.9956 | 0.9956 | 0.9956 | 0.9956 |
| 0.035 | 0.315 | 3150 | 0.0228 | 0.9946 | 0.9946 | 0.9946 | 0.9946 |
| 0.0321 | 0.32 | 3200 | 0.0198 | 0.9956 | 0.9956 | 0.9956 | 0.9956 |
| 0.0106 | 0.325 | 3250 | 0.0237 | 0.9952 | 0.9952 | 0.9952 | 0.9952 |
| 0.028 | 0.33 | 3300 | 0.0232 | 0.9958 | 0.9958 | 0.9958 | 0.9958 |
| 0.0275 | 0.335 | 3350 | 0.0262 | 0.9956 | 0.9956 | 0.9956 | 0.9956 |
| 0.0435 | 0.34 | 3400 | 0.0239 | 0.9952 | 0.9952 | 0.9952 | 0.9952 |
| 0.0444 | 0.345 | 3450 | 0.0234 | 0.9956 | 0.9956 | 0.9956 | 0.9956 |
| 0.051 | 0.35 | 3500 | 0.0206 | 0.9954 | 0.9954 | 0.9954 | 0.9954 |
| 0.0202 | 0.355 | 3550 | 0.0204 | 0.9958 | 0.9958 | 0.9958 | 0.9958 |
| 0.024 | 0.36 | 3600 | 0.0238 | 0.9956 | 0.9956 | 0.9956 | 0.9956 |
| 0.0265 | 0.365 | 3650 | 0.0239 | 0.9960 | 0.9960 | 0.9960 | 0.9960 |
| 0.0311 | 0.37 | 3700 | 0.0234 | 0.9962 | 0.9962 | 0.9962 | 0.9962 |
| 0.03 | 0.375 | 3750 | 0.0237 | 0.9956 | 0.9956 | 0.9956 | 0.9956 |
| 0.027 | 0.38 | 3800 | 0.0230 | 0.9958 | 0.9958 | 0.9958 | 0.9958 |
| 0.0356 | 0.385 | 3850 | 0.0199 | 0.9962 | 0.9962 | 0.9962 | 0.9962 |
| 0.0264 | 0.39 | 3900 | 0.0222 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0162 | 0.395 | 3950 | 0.0225 | 0.9962 | 0.9962 | 0.9962 | 0.9962 |
| 0.0285 | 0.4 | 4000 | 0.0212 | 0.9962 | 0.9962 | 0.9962 | 0.9962 |
| 0.0158 | 0.405 | 4050 | 0.0222 | 0.9956 | 0.9956 | 0.9956 | 0.9956 |
| 0.0348 | 0.41 | 4100 | 0.0209 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0102 | 0.415 | 4150 | 0.0207 | 0.9962 | 0.9962 | 0.9962 | 0.9962 |
| 0.0158 | 0.42 | 4200 | 0.0217 | 0.9960 | 0.9960 | 0.9960 | 0.9960 |
| 0.0297 | 0.425 | 4250 | 0.0234 | 0.9960 | 0.9960 | 0.9960 | 0.9960 |
| 0.0318 | 0.43 | 4300 | 0.0206 | 0.9960 | 0.9960 | 0.9960 | 0.9960 |
| 0.0269 | 0.435 | 4350 | 0.0204 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0015 | 0.44 | 4400 | 0.0210 | 0.9960 | 0.9960 | 0.9960 | 0.9960 |
| 0.0068 | 0.445 | 4450 | 0.0219 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0294 | 0.45 | 4500 | 0.0216 | 0.9962 | 0.9962 | 0.9962 | 0.9962 |
| 0.0276 | 0.455 | 4550 | 0.0219 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0245 | 0.46 | 4600 | 0.0196 | 0.9958 | 0.9958 | 0.9958 | 0.9958 |
| 0.0149 | 0.465 | 4650 | 0.0198 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0139 | 0.47 | 4700 | 0.0202 | 0.9960 | 0.9960 | 0.9960 | 0.9960 |
| 0.0206 | 0.475 | 4750 | 0.0215 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0047 | 0.48 | 4800 | 0.0219 | 0.9958 | 0.9958 | 0.9958 | 0.9958 |
| 0.0153 | 0.485 | 4850 | 0.0206 | 0.9972 | 0.9972 | 0.9972 | 0.9972 |
| 0.0312 | 0.49 | 4900 | 0.0168 | 0.9970 | 0.9970 | 0.9970 | 0.9970 |
| 0.0308 | 0.495 | 4950 | 0.0195 | 0.9954 | 0.9954 | 0.9954 | 0.9954 |
| 0.0417 | 0.5 | 5000 | 0.0183 | 0.9960 | 0.9960 | 0.9960 | 0.9960 |
| 0.0105 | 0.505 | 5050 | 0.0194 | 0.9960 | 0.9960 | 0.9960 | 0.9960 |
| 0.0256 | 0.51 | 5100 | 0.0170 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0114 | 0.515 | 5150 | 0.0183 | 0.9972 | 0.9972 | 0.9972 | 0.9972 |
| 0.0253 | 0.52 | 5200 | 0.0187 | 0.9962 | 0.9962 | 0.9962 | 0.9962 |
| 0.0256 | 0.525 | 5250 | 0.0196 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0527 | 0.53 | 5300 | 0.0171 | 0.9962 | 0.9962 | 0.9962 | 0.9962 |
| 0.0236 | 0.535 | 5350 | 0.0173 | 0.9970 | 0.9970 | 0.9970 | 0.9970 |
| 0.0199 | 0.54 | 5400 | 0.0168 | 0.9962 | 0.9962 | 0.9962 | 0.9962 |
| 0.0145 | 0.545 | 5450 | 0.0220 | 0.9958 | 0.9958 | 0.9958 | 0.9958 |
| 0.0208 | 0.55 | 5500 | 0.0215 | 0.9952 | 0.9952 | 0.9952 | 0.9952 |
| 0.0234 | 0.555 | 5550 | 0.0192 | 0.9958 | 0.9958 | 0.9958 | 0.9958 |
| 0.0214 | 0.56 | 5600 | 0.0171 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0226 | 0.565 | 5650 | 0.0169 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0089 | 0.57 | 5700 | 0.0174 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0215 | 0.575 | 5750 | 0.0192 | 0.9960 | 0.9960 | 0.9960 | 0.9960 |
| 0.0034 | 0.58 | 5800 | 0.0202 | 0.9956 | 0.9956 | 0.9956 | 0.9956 |
| 0.0275 | 0.585 | 5850 | 0.0178 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0205 | 0.59 | 5900 | 0.0169 | 0.9962 | 0.9962 | 0.9962 | 0.9962 |
| 0.0089 | 0.595 | 5950 | 0.0184 | 0.9960 | 0.9960 | 0.9960 | 0.9960 |
| 0.0185 | 0.6 | 6000 | 0.0168 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0193 | 0.605 | 6050 | 0.0159 | 0.9970 | 0.9970 | 0.9970 | 0.9970 |
| 0.0071 | 0.61 | 6100 | 0.0170 | 0.9970 | 0.9970 | 0.9970 | 0.9970 |
| 0.009 | 0.615 | 6150 | 0.0166 | 0.9974 | 0.9974 | 0.9974 | 0.9974 |
| 0.0222 | 0.62 | 6200 | 0.0167 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0129 | 0.625 | 6250 | 0.0200 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0107 | 0.63 | 6300 | 0.0182 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0161 | 0.635 | 6350 | 0.0186 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.02 | 0.64 | 6400 | 0.0178 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.011 | 0.645 | 6450 | 0.0173 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0172 | 0.65 | 6500 | 0.0193 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0265 | 0.655 | 6550 | 0.0180 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0248 | 0.66 | 6600 | 0.0176 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0265 | 0.665 | 6650 | 0.0182 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0062 | 0.67 | 6700 | 0.0196 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0066 | 0.675 | 6750 | 0.0206 | 0.9962 | 0.9962 | 0.9962 | 0.9962 |
| 0.0346 | 0.68 | 6800 | 0.0185 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0337 | 0.685 | 6850 | 0.0167 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0128 | 0.69 | 6900 | 0.0189 | 0.9960 | 0.9960 | 0.9960 | 0.9960 |
| 0.0132 | 0.695 | 6950 | 0.0166 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0153 | 0.7 | 7000 | 0.0166 | 0.9972 | 0.9972 | 0.9972 | 0.9972 |
| 0.0034 | 0.705 | 7050 | 0.0175 | 0.9972 | 0.9972 | 0.9972 | 0.9972 |
| 0.0046 | 0.71 | 7100 | 0.0180 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0221 | 0.715 | 7150 | 0.0168 | 0.9970 | 0.9970 | 0.9970 | 0.9970 |
| 0.0089 | 0.72 | 7200 | 0.0192 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0222 | 0.725 | 7250 | 0.0165 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0051 | 0.73 | 7300 | 0.0172 | 0.9970 | 0.9970 | 0.9970 | 0.9970 |
| 0.0191 | 0.735 | 7350 | 0.0162 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0136 | 0.74 | 7400 | 0.0165 | 0.9972 | 0.9972 | 0.9972 | 0.9972 |
| 0.0108 | 0.745 | 7450 | 0.0189 | 0.9962 | 0.9962 | 0.9962 | 0.9962 |
| 0.0265 | 0.75 | 7500 | 0.0175 | 0.9970 | 0.9970 | 0.9970 | 0.9970 |
| 0.0038 | 0.755 | 7550 | 0.0187 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0074 | 0.76 | 7600 | 0.0172 | 0.9972 | 0.9972 | 0.9972 | 0.9972 |
| 0.0258 | 0.765 | 7650 | 0.0169 | 0.9972 | 0.9972 | 0.9972 | 0.9972 |
| 0.0378 | 0.77 | 7700 | 0.0175 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0035 | 0.775 | 7750 | 0.0186 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.013 | 0.78 | 7800 | 0.0206 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.003 | 0.785 | 7850 | 0.0178 | 0.9972 | 0.9972 | 0.9972 | 0.9972 |
| 0.0204 | 0.79 | 7900 | 0.0189 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0233 | 0.795 | 7950 | 0.0173 | 0.9972 | 0.9972 | 0.9972 | 0.9972 |
| 0.0172 | 0.8 | 8000 | 0.0181 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0053 | 0.805 | 8050 | 0.0206 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0093 | 0.81 | 8100 | 0.0195 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0094 | 0.815 | 8150 | 0.0180 | 0.9970 | 0.9970 | 0.9970 | 0.9970 |
| 0.0096 | 0.82 | 8200 | 0.0188 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0099 | 0.825 | 8250 | 0.0184 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0099 | 0.83 | 8300 | 0.0188 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0034 | 0.835 | 8350 | 0.0181 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0104 | 0.84 | 8400 | 0.0180 | 0.9970 | 0.9970 | 0.9970 | 0.9970 |
| 0.0058 | 0.845 | 8450 | 0.0185 | 0.9970 | 0.9970 | 0.9970 | 0.9970 |
| 0.0074 | 0.85 | 8500 | 0.0191 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.019 | 0.855 | 8550 | 0.0179 | 0.9972 | 0.9972 | 0.9972 | 0.9972 |
| 0.0189 | 0.86 | 8600 | 0.0182 | 0.9970 | 0.9970 | 0.9970 | 0.9970 |
| 0.0142 | 0.865 | 8650 | 0.0182 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0079 | 0.87 | 8700 | 0.0186 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0191 | 0.875 | 8750 | 0.0180 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0032 | 0.88 | 8800 | 0.0191 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0105 | 0.885 | 8850 | 0.0185 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0054 | 0.89 | 8900 | 0.0172 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.0076 | 0.895 | 8950 | 0.0170 | 0.9968 | 0.9968 | 0.9968 | 0.9968 |
| 0.016 | 0.9 | 9000 | 0.0177 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0041 | 0.905 | 9050 | 0.0173 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0105 | 0.91 | 9100 | 0.0174 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.018 | 0.915 | 9150 | 0.0182 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0056 | 0.92 | 9200 | 0.0178 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0064 | 0.925 | 9250 | 0.0178 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0107 | 0.93 | 9300 | 0.0175 | 0.9970 | 0.9970 | 0.9970 | 0.9970 |
| 0.0083 | 0.935 | 9350 | 0.0179 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0065 | 0.94 | 9400 | 0.0180 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0117 | 0.945 | 9450 | 0.0179 | 0.9966 | 0.9966 | 0.9966 | 0.9966 |
| 0.0108 | 0.95 | 9500 | 0.0182 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0135 | 0.955 | 9550 | 0.0184 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0108 | 0.96 | 9600 | 0.0186 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0075 | 0.965 | 9650 | 0.0185 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0308 | 0.97 | 9700 | 0.0187 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0012 | 0.975 | 9750 | 0.0183 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0002 | 0.98 | 9800 | 0.0187 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0106 | 0.985 | 9850 | 0.0186 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0046 | 0.99 | 9900 | 0.0187 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0206 | 0.995 | 9950 | 0.0187 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
| 0.0112 | 1.0 | 10000 | 0.0187 | 0.9964 | 0.9964 | 0.9964 | 0.9964 |
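
Accuracy, F1, precision, and recall track each other almost exactly at every step, which is consistent with averaged multi-class metrics. Below is a hedged sketch of a Trainer-style compute_metrics function that would produce columns of this shape; the weighted averaging is an assumption, since the card does not state how the metrics were aggregated.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # Weighted averaging is an assumption; the card does not specify
    # which averaging scheme produced the numbers in the table above.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```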

Framework versions

  • Transformers 4.43.3
  • PyTorch 2.4.0
  • Datasets 2.20.0
  • Tokenizers 0.19.1

Model size

  • 86.9M parameters (Safetensors, F32)

Model tree for amaye15/SwinV2-Base-Document-Classifier

  • Finetuned from microsoft/swinv2-base-patch4-window16-256