---
license: apache-2.0
base_model: microsoft/swin-tiny-patch4-window7-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: swin-tiny-patch4-window7-224-finetuned-ADC-3cls-0922
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: test
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.8285714285714286
---

# swin-tiny-patch4-window7-224-finetuned-ADC-3cls-0922

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.6771
- Accuracy: 0.8286
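For quick inference, the checkpoint can be loaded with the `transformers` image-classification pipeline. A minimal sketch, assuming the model is published on the Hub under this card's name (the repo id `Niraya666/swin-tiny-patch4-window7-224-finetuned-ADC-3cls-0922` is an assumption) and that `example.png` is a local image:

```python
from transformers import pipeline

# Repo id is an assumption; substitute the actual Hub id or a local checkpoint path.
classifier = pipeline(
    "image-classification",
    model="Niraya666/swin-tiny-patch4-window7-224-finetuned-ADC-3cls-0922",
)

# The pipeline accepts a local path, a URL, or a PIL.Image and returns
# the predicted labels with their scores.
predictions = classifier("example.png")
print(predictions)
```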

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
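The dataset is identified only by its loader type (`imagefolder`). As a point of reference, here is a minimal sketch of how an imagefolder-style dataset is typically loaded with the `datasets` library; the directory layout below is an assumption, not a detail of the original run:

```python
from datasets import load_dataset

# Assumed layout: one subfolder per class, e.g.
#   data/train/<class_name>/*.png
#   data/test/<class_name>/*.png
dataset = load_dataset("imagefolder", data_dir="data")

print(dataset)              # DatasetDict with the discovered splits
print(dataset["train"][0])  # {"image": <PIL.Image.Image>, "label": <int>}
```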

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):

- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 200
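A minimal `TrainingArguments` sketch reconstructed from the list above. The `output_dir`, evaluation strategy, and logging strategy are assumptions (the per-epoch rows in the results table suggest epoch-level evaluation); Adam's betas and epsilon match the library defaults, so they are not set explicitly:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-ADC-3cls-0922",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=4,  # 64 x 4 = 256 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.2,
    num_train_epochs=200,
    evaluation_strategy="epoch",    # assumption, consistent with the results table
    logging_strategy="epoch",       # assumption
)
```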

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 2    | 0.6875          | 0.8143   |
| No log        | 2.0   | 4    | 0.6874          | 0.8143   |
| No log        | 3.0   | 6    | 0.6873          | 0.8143   |
| No log        | 4.0   | 8    | 0.6871          | 0.8143   |
| 0.7555        | 5.0   | 10   | 0.6869          | 0.8143   |
| 0.7555        | 6.0   | 12   | 0.6866          | 0.8143   |
| 0.7555        | 7.0   | 14   | 0.6862          | 0.8143   |
| 0.7555        | 8.0   | 16   | 0.6858          | 0.8143   |
| 0.7555        | 9.0   | 18   | 0.6853          | 0.8143   |
| 0.7576        | 10.0  | 20   | 0.6848          | 0.8143   |
| 0.7576        | 11.0  | 22   | 0.6842          | 0.8143   |
| 0.7576        | 12.0  | 24   | 0.6836          | 0.8143   |
| 0.7576        | 13.0  | 26   | 0.6830          | 0.8143   |
| 0.7576        | 14.0  | 28   | 0.6823          | 0.8143   |
| 0.769         | 15.0  | 30   | 0.6816          | 0.8      |
| 0.769         | 16.0  | 32   | 0.6808          | 0.8      |
| 0.769         | 17.0  | 34   | 0.6800          | 0.8143   |
| 0.769         | 18.0  | 36   | 0.6791          | 0.8143   |
| 0.769         | 19.0  | 38   | 0.6781          | 0.8143   |
| 0.7564        | 20.0  | 40   | 0.6771          | 0.8286   |
| 0.7564        | 21.0  | 42   | 0.6760          | 0.8143   |
| 0.7564        | 22.0  | 44   | 0.6748          | 0.8143   |
| 0.7564        | 23.0  | 46   | 0.6737          | 0.8      |
| 0.7564        | 24.0  | 48   | 0.6725          | 0.8      |
| 0.7508        | 25.0  | 50   | 0.6713          | 0.8143   |
| 0.7508        | 26.0  | 52   | 0.6701          | 0.8143   |
| 0.7508        | 27.0  | 54   | 0.6689          | 0.8143   |
| 0.7508        | 28.0  | 56   | 0.6674          | 0.8143   |
| 0.7508        | 29.0  | 58   | 0.6660          | 0.8143   |
| 0.747         | 30.0  | 60   | 0.6646          | 0.8143   |
| 0.747         | 31.0  | 62   | 0.6631          | 0.8143   |
| 0.747         | 32.0  | 64   | 0.6616          | 0.8143   |
| 0.747         | 33.0  | 66   | 0.6601          | 0.8143   |
| 0.747         | 34.0  | 68   | 0.6586          | 0.8143   |
| 0.7343        | 35.0  | 70   | 0.6570          | 0.8143   |
| 0.7343        | 36.0  | 72   | 0.6553          | 0.8143   |
| 0.7343        | 37.0  | 74   | 0.6536          | 0.8143   |
| 0.7343        | 38.0  | 76   | 0.6517          | 0.8143   |
| 0.7343        | 39.0  | 78   | 0.6499          | 0.8143   |
| 0.7532        | 40.0  | 80   | 0.6480          | 0.8143   |
| 0.7532        | 41.0  | 82   | 0.6461          | 0.8143   |
| 0.7532        | 42.0  | 84   | 0.6442          | 0.8143   |
| 0.7532        | 43.0  | 86   | 0.6423          | 0.8143   |
| 0.7532        | 44.0  | 88   | 0.6405          | 0.8143   |
| 0.7239        | 45.0  | 90   | 0.6387          | 0.8143   |
| 0.7239        | 46.0  | 92   | 0.6368          | 0.8143   |
| 0.7239        | 47.0  | 94   | 0.6352          | 0.8143   |
| 0.7239        | 48.0  | 96   | 0.6337          | 0.8143   |
| 0.7239        | 49.0  | 98   | 0.6321          | 0.8286   |
| 0.7085        | 50.0  | 100  | 0.6307          | 0.8286   |
| 0.7085        | 51.0  | 102  | 0.6294          | 0.8286   |
| 0.7085        | 52.0  | 104  | 0.6278          | 0.8286   |
| 0.7085        | 53.0  | 106  | 0.6263          | 0.8286   |
| 0.7085        | 54.0  | 108  | 0.6248          | 0.8143   |
| 0.7203        | 55.0  | 110  | 0.6233          | 0.8143   |
| 0.7203        | 56.0  | 112  | 0.6219          | 0.8143   |
| 0.7203        | 57.0  | 114  | 0.6205          | 0.8143   |
| 0.7203        | 58.0  | 116  | 0.6191          | 0.8143   |
| 0.7203        | 59.0  | 118  | 0.6179          | 0.8143   |
| 0.7136        | 60.0  | 120  | 0.6167          | 0.8143   |
| 0.7136        | 61.0  | 122  | 0.6157          | 0.8143   |
| 0.7136        | 62.0  | 124  | 0.6148          | 0.8      |
| 0.7136        | 63.0  | 126  | 0.6138          | 0.8      |
| 0.7136        | 64.0  | 128  | 0.6125          | 0.8      |
| 0.7123        | 65.0  | 130  | 0.6111          | 0.8      |
| 0.7123        | 66.0  | 132  | 0.6096          | 0.8143   |
| 0.7123        | 67.0  | 134  | 0.6083          | 0.8143   |
| 0.7123        | 68.0  | 136  | 0.6070          | 0.8143   |
| 0.7123        | 69.0  | 138  | 0.6057          | 0.8143   |
| 0.7076        | 70.0  | 140  | 0.6046          | 0.8143   |
| 0.7076        | 71.0  | 142  | 0.6035          | 0.8143   |
| 0.7076        | 72.0  | 144  | 0.6023          | 0.8143   |
| 0.7076        | 73.0  | 146  | 0.6011          | 0.8143   |
| 0.7076        | 74.0  | 148  | 0.5999          | 0.8143   |
| 0.6878        | 75.0  | 150  | 0.5988          | 0.8143   |
| 0.6878        | 76.0  | 152  | 0.5975          | 0.8143   |
| 0.6878        | 77.0  | 154  | 0.5964          | 0.8143   |
| 0.6878        | 78.0  | 156  | 0.5953          | 0.8143   |
| 0.6878        | 79.0  | 158  | 0.5942          | 0.8143   |
| 0.6657        | 80.0  | 160  | 0.5932          | 0.8143   |
| 0.6657        | 81.0  | 162  | 0.5923          | 0.8143   |
| 0.6657        | 82.0  | 164  | 0.5914          | 0.8143   |
| 0.6657        | 83.0  | 166  | 0.5906          | 0.8143   |
| 0.6657        | 84.0  | 168  | 0.5897          | 0.8143   |
| 0.6434        | 85.0  | 170  | 0.5888          | 0.8143   |
| 0.6434        | 86.0  | 172  | 0.5878          | 0.8143   |
| 0.6434        | 87.0  | 174  | 0.5868          | 0.8143   |
| 0.6434        | 88.0  | 176  | 0.5859          | 0.8143   |
| 0.6434        | 89.0  | 178  | 0.5851          | 0.8143   |
| 0.6825        | 90.0  | 180  | 0.5843          | 0.8143   |
| 0.6825        | 91.0  | 182  | 0.5836          | 0.8143   |
| 0.6825        | 92.0  | 184  | 0.5828          | 0.8143   |
| 0.6825        | 93.0  | 186  | 0.5823          | 0.8143   |
| 0.6825        | 94.0  | 188  | 0.5817          | 0.8286   |
| 0.6695        | 95.0  | 190  | 0.5809          | 0.8143   |
| 0.6695        | 96.0  | 192  | 0.5801          | 0.8143   |
| 0.6695        | 97.0  | 194  | 0.5793          | 0.8143   |
| 0.6695        | 98.0  | 196  | 0.5787          | 0.8143   |
| 0.6695        | 99.0  | 198  | 0.5780          | 0.8143   |
| 0.6672        | 100.0 | 200  | 0.5772          | 0.8143   |
| 0.6672        | 101.0 | 202  | 0.5762          | 0.8143   |
| 0.6672        | 102.0 | 204  | 0.5754          | 0.8143   |
| 0.6672        | 103.0 | 206  | 0.5746          | 0.8143   |
| 0.6672        | 104.0 | 208  | 0.5738          | 0.8143   |
| 0.6569        | 105.0 | 210  | 0.5731          | 0.8143   |
| 0.6569        | 106.0 | 212  | 0.5724          | 0.8143   |
| 0.6569        | 107.0 | 214  | 0.5716          | 0.8143   |
| 0.6569        | 108.0 | 216  | 0.5708          | 0.8143   |
| 0.6569        | 109.0 | 218  | 0.5701          | 0.8143   |
| 0.6748        | 110.0 | 220  | 0.5694          | 0.8143   |
| 0.6748        | 111.0 | 222  | 0.5687          | 0.8143   |
| 0.6748        | 112.0 | 224  | 0.5680          | 0.8143   |
| 0.6748        | 113.0 | 226  | 0.5674          | 0.8143   |
| 0.6748        | 114.0 | 228  | 0.5668          | 0.8143   |
| 0.6388        | 115.0 | 230  | 0.5662          | 0.8143   |
| 0.6388        | 116.0 | 232  | 0.5657          | 0.8143   |
| 0.6388        | 117.0 | 234  | 0.5652          | 0.8143   |
| 0.6388        | 118.0 | 236  | 0.5648          | 0.8286   |
| 0.6388        | 119.0 | 238  | 0.5645          | 0.8286   |
| 0.6551        | 120.0 | 240  | 0.5641          | 0.8286   |
| 0.6551        | 121.0 | 242  | 0.5636          | 0.8143   |
| 0.6551        | 122.0 | 244  | 0.5631          | 0.8143   |
| 0.6551        | 123.0 | 246  | 0.5627          | 0.8143   |
| 0.6551        | 124.0 | 248  | 0.5624          | 0.8143   |
| 0.6452        | 125.0 | 250  | 0.5622          | 0.8143   |
| 0.6452        | 126.0 | 252  | 0.5620          | 0.8143   |
| 0.6452        | 127.0 | 254  | 0.5618          | 0.8143   |
| 0.6452        | 128.0 | 256  | 0.5615          | 0.8143   |
| 0.6452        | 129.0 | 258  | 0.5613          | 0.8143   |
| 0.645         | 130.0 | 260  | 0.5611          | 0.8143   |
| 0.645         | 131.0 | 262  | 0.5608          | 0.8143   |
| 0.645         | 132.0 | 264  | 0.5606          | 0.8143   |
| 0.645         | 133.0 | 266  | 0.5602          | 0.8143   |
| 0.645         | 134.0 | 268  | 0.5596          | 0.8143   |
| 0.629         | 135.0 | 270  | 0.5590          | 0.8143   |
| 0.629         | 136.0 | 272  | 0.5582          | 0.8143   |
| 0.629         | 137.0 | 274  | 0.5576          | 0.8143   |
| 0.629         | 138.0 | 276  | 0.5571          | 0.8143   |
| 0.629         | 139.0 | 278  | 0.5568          | 0.8143   |
| 0.7126        | 140.0 | 280  | 0.5565          | 0.8143   |
| 0.7126        | 141.0 | 282  | 0.5563          | 0.8143   |
| 0.7126        | 142.0 | 284  | 0.5561          | 0.8143   |
| 0.7126        | 143.0 | 286  | 0.5559          | 0.8143   |
| 0.7126        | 144.0 | 288  | 0.5555          | 0.8143   |
| 0.669         | 145.0 | 290  | 0.5552          | 0.8143   |
| 0.669         | 146.0 | 292  | 0.5547          | 0.8143   |
| 0.669         | 147.0 | 294  | 0.5542          | 0.8143   |
| 0.669         | 148.0 | 296  | 0.5538          | 0.8143   |
| 0.669         | 149.0 | 298  | 0.5534          | 0.8143   |
| 0.6481        | 150.0 | 300  | 0.5530          | 0.8143   |
| 0.6481        | 151.0 | 302  | 0.5526          | 0.8143   |
| 0.6481        | 152.0 | 304  | 0.5522          | 0.8143   |
| 0.6481        | 153.0 | 306  | 0.5519          | 0.8143   |
| 0.6481        | 154.0 | 308  | 0.5515          | 0.8143   |
| 0.6211        | 155.0 | 310  | 0.5510          | 0.8143   |
| 0.6211        | 156.0 | 312  | 0.5506          | 0.8143   |
| 0.6211        | 157.0 | 314  | 0.5502          | 0.8143   |
| 0.6211        | 158.0 | 316  | 0.5499          | 0.8143   |
| 0.6211        | 159.0 | 318  | 0.5496          | 0.8143   |
| 0.6458        | 160.0 | 320  | 0.5492          | 0.8286   |
| 0.6458        | 161.0 | 322  | 0.5490          | 0.8143   |
| 0.6458        | 162.0 | 324  | 0.5488          | 0.8143   |
| 0.6458        | 163.0 | 326  | 0.5486          | 0.8143   |
| 0.6458        | 164.0 | 328  | 0.5484          | 0.8143   |
| 0.6317        | 165.0 | 330  | 0.5481          | 0.8143   |
| 0.6317        | 166.0 | 332  | 0.5479          | 0.8286   |
| 0.6317        | 167.0 | 334  | 0.5476          | 0.8286   |
| 0.6317        | 168.0 | 336  | 0.5473          | 0.8286   |
| 0.6317        | 169.0 | 338  | 0.5471          | 0.8286   |
| 0.6154        | 170.0 | 340  | 0.5470          | 0.8286   |
| 0.6154        | 171.0 | 342  | 0.5468          | 0.8286   |
| 0.6154        | 172.0 | 344  | 0.5466          | 0.8286   |
| 0.6154        | 173.0 | 346  | 0.5464          | 0.8286   |
| 0.6154        | 174.0 | 348  | 0.5462          | 0.8286   |
| 0.6323        | 175.0 | 350  | 0.5460          | 0.8286   |
| 0.6323        | 176.0 | 352  | 0.5459          | 0.8286   |
| 0.6323        | 177.0 | 354  | 0.5457          | 0.8286   |
| 0.6323        | 178.0 | 356  | 0.5456          | 0.8286   |
| 0.6323        | 179.0 | 358  | 0.5455          | 0.8286   |
| 0.6331        | 180.0 | 360  | 0.5453          | 0.8286   |
| 0.6331        | 181.0 | 362  | 0.5452          | 0.8286   |
| 0.6331        | 182.0 | 364  | 0.5451          | 0.8286   |
| 0.6331        | 183.0 | 366  | 0.5449          | 0.8286   |
| 0.6331        | 184.0 | 368  | 0.5448          | 0.8286   |
| 0.6333        | 185.0 | 370  | 0.5447          | 0.8286   |
| 0.6333        | 186.0 | 372  | 0.5447          | 0.8286   |
| 0.6333        | 187.0 | 374  | 0.5446          | 0.8286   |
| 0.6333        | 188.0 | 376  | 0.5445          | 0.8286   |
| 0.6333        | 189.0 | 378  | 0.5445          | 0.8286   |
| 0.608         | 190.0 | 380  | 0.5444          | 0.8286   |
| 0.608         | 191.0 | 382  | 0.5444          | 0.8286   |
| 0.608         | 192.0 | 384  | 0.5443          | 0.8286   |
| 0.608         | 193.0 | 386  | 0.5443          | 0.8286   |
| 0.608         | 194.0 | 388  | 0.5442          | 0.8286   |
| 0.6155        | 195.0 | 390  | 0.5442          | 0.8286   |
| 0.6155        | 196.0 | 392  | 0.5442          | 0.8286   |
| 0.6155        | 197.0 | 394  | 0.5442          | 0.8286   |
| 0.6155        | 198.0 | 396  | 0.5441          | 0.8286   |
| 0.6155        | 199.0 | 398  | 0.5441          | 0.8286   |
| 0.6272        | 200.0 | 400  | 0.5441          | 0.8286   |

### Framework versions

- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3