---
license: apache-2.0
base_model: facebook/deit-base-distilled-patch16-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: deit-base-distilled-patch16-224-65-fold4
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.8732394366197183
---

# deit-base-distilled-patch16-224-65-fold4

This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.4832
- Accuracy: 0.8732
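
For quick inference, a minimal sketch using the Transformers `pipeline` API is shown below. The Hub id `BilalMuftuoglu/deit-base-distilled-patch16-224-65-fold4` is inferred from the model name and uploader and may need adjusting; the image path is a placeholder.

```python
from transformers import pipeline

# Hub id inferred from the model name above; replace it if your copy lives elsewhere.
classifier = pipeline(
    "image-classification",
    model="BilalMuftuoglu/deit-base-distilled-patch16-224-65-fold4",
)

# Placeholder image path; returns a list of {"label": ..., "score": ...} dicts.
predictions = classifier("path/to/image.jpg")
print(predictions)
```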

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
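
As a rough sketch, these settings map onto `TrainingArguments` as follows. The `output_dir` is a placeholder; the Adam betas and epsilon above match the optimizer defaults, and the total train batch size of 128 follows from 32 per device times 4 accumulation steps.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="deit-base-distilled-patch16-224-65-fold4",
    learning_rate=5e-05,
    per_device_train_batch_size=32,  # 32 x 4 accumulation steps = 128 effective
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
)
```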

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| No log        | 0.9231  | 3    | 0.7219          | 0.5493   |
| No log        | 1.8462  | 6    | 0.6863          | 0.5493   |
| No log        | 2.7692  | 9    | 0.6578          | 0.5493   |
| 0.6825        | 4.0     | 13   | 0.6293          | 0.6338   |
| 0.6825        | 4.9231  | 16   | 0.6186          | 0.6761   |
| 0.6825        | 5.8462  | 19   | 0.6135          | 0.7042   |
| 0.6206        | 6.7692  | 22   | 0.6163          | 0.6479   |
| 0.6206        | 8.0     | 26   | 0.6350          | 0.6479   |
| 0.6206        | 8.9231  | 29   | 0.6078          | 0.6901   |
| 0.5728        | 9.8462  | 32   | 0.6873          | 0.6761   |
| 0.5728        | 10.7692 | 35   | 0.6771          | 0.6761   |
| 0.5728        | 12.0    | 39   | 0.5912          | 0.6620   |
| 0.5329        | 12.9231 | 42   | 0.5524          | 0.7465   |
| 0.5329        | 13.8462 | 45   | 0.5923          | 0.7183   |
| 0.5329        | 14.7692 | 48   | 0.6650          | 0.6761   |
| 0.4279        | 16.0    | 52   | 0.5183          | 0.7746   |
| 0.4279        | 16.9231 | 55   | 0.4761          | 0.7887   |
| 0.4279        | 17.8462 | 58   | 0.5590          | 0.7183   |
| 0.4055        | 18.7692 | 61   | 0.5320          | 0.7465   |
| 0.4055        | 20.0    | 65   | 0.6605          | 0.7183   |
| 0.4055        | 20.9231 | 68   | 0.5821          | 0.7465   |
| 0.3478        | 21.8462 | 71   | 0.5589          | 0.7465   |
| 0.3478        | 22.7692 | 74   | 0.6247          | 0.7465   |
| 0.3478        | 24.0    | 78   | 0.7006          | 0.6620   |
| 0.3769        | 24.9231 | 81   | 0.7575          | 0.7183   |
| 0.3769        | 25.8462 | 84   | 0.5367          | 0.7746   |
| 0.3769        | 26.7692 | 87   | 0.5335          | 0.7746   |
| 0.2957        | 28.0    | 91   | 0.5914          | 0.7606   |
| 0.2957        | 28.9231 | 94   | 0.6780          | 0.7465   |
| 0.2957        | 29.8462 | 97   | 0.5345          | 0.7746   |
| 0.2463        | 30.7692 | 100  | 0.6132          | 0.7606   |
| 0.2463        | 32.0    | 104  | 0.5758          | 0.7887   |
| 0.2463        | 32.9231 | 107  | 0.7236          | 0.7324   |
| 0.2323        | 33.8462 | 110  | 0.5247          | 0.8169   |
| 0.2323        | 34.7692 | 113  | 0.6018          | 0.7183   |
| 0.2323        | 36.0    | 117  | 0.5366          | 0.8028   |
| 0.1921        | 36.9231 | 120  | 0.6314          | 0.7465   |
| 0.1921        | 37.8462 | 123  | 0.5763          | 0.7746   |
| 0.1921        | 38.7692 | 126  | 0.5574          | 0.8028   |
| 0.1686        | 40.0    | 130  | 0.6261          | 0.7887   |
| 0.1686        | 40.9231 | 133  | 0.6525          | 0.7887   |
| 0.1686        | 41.8462 | 136  | 0.5726          | 0.8169   |
| 0.1686        | 42.7692 | 139  | 0.8200          | 0.7042   |
| 0.2073        | 44.0    | 143  | 0.4798          | 0.7887   |
| 0.2073        | 44.9231 | 146  | 0.5342          | 0.7887   |
| 0.2073        | 45.8462 | 149  | 0.4834          | 0.7887   |
| 0.1702        | 46.7692 | 152  | 0.6101          | 0.7465   |
| 0.1702        | 48.0    | 156  | 0.4779          | 0.8169   |
| 0.1702        | 48.9231 | 159  | 0.5048          | 0.7887   |
| 0.153         | 49.8462 | 162  | 0.6298          | 0.7465   |
| 0.153         | 50.7692 | 165  | 0.5995          | 0.7606   |
| 0.153         | 52.0    | 169  | 0.6475          | 0.7042   |
| 0.1508        | 52.9231 | 172  | 0.4888          | 0.8028   |
| 0.1508        | 53.8462 | 175  | 0.4954          | 0.8310   |
| 0.1508        | 54.7692 | 178  | 0.4390          | 0.8028   |
| 0.1293        | 56.0    | 182  | 0.4778          | 0.8592   |
| 0.1293        | 56.9231 | 185  | 0.4888          | 0.8310   |
| 0.1293        | 57.8462 | 188  | 0.4832          | 0.8732   |
| 0.1489        | 58.7692 | 191  | 0.5277          | 0.8310   |
| 0.1489        | 60.0    | 195  | 0.6217          | 0.7324   |
| 0.1489        | 60.9231 | 198  | 0.6090          | 0.7465   |
| 0.1487        | 61.8462 | 201  | 0.5424          | 0.8451   |
| 0.1487        | 62.7692 | 204  | 0.5570          | 0.8451   |
| 0.1487        | 64.0    | 208  | 0.7248          | 0.7183   |
| 0.1456        | 64.9231 | 211  | 0.5841          | 0.7887   |
| 0.1456        | 65.8462 | 214  | 0.5905          | 0.8310   |
| 0.1456        | 66.7692 | 217  | 0.5609          | 0.8310   |
| 0.1284        | 68.0    | 221  | 0.5470          | 0.8028   |
| 0.1284        | 68.9231 | 224  | 0.5473          | 0.8310   |
| 0.1284        | 69.8462 | 227  | 0.5813          | 0.8310   |
| 0.1225        | 70.7692 | 230  | 0.5683          | 0.8451   |
| 0.1225        | 72.0    | 234  | 0.5581          | 0.8310   |
| 0.1225        | 72.9231 | 237  | 0.5717          | 0.7887   |
| 0.1233        | 73.8462 | 240  | 0.6054          | 0.7606   |
| 0.1233        | 74.7692 | 243  | 0.5910          | 0.7887   |
| 0.1233        | 76.0    | 247  | 0.5707          | 0.8169   |
| 0.1234        | 76.9231 | 250  | 0.5733          | 0.8028   |
| 0.1234        | 77.8462 | 253  | 0.5748          | 0.8028   |
| 0.1234        | 78.7692 | 256  | 0.5723          | 0.7887   |
| 0.1219        | 80.0    | 260  | 0.5503          | 0.8169   |
| 0.1219        | 80.9231 | 263  | 0.5532          | 0.8028   |
| 0.1219        | 81.8462 | 266  | 0.5828          | 0.7746   |
| 0.1219        | 82.7692 | 269  | 0.6062          | 0.7746   |
| 0.1075        | 84.0    | 273  | 0.5752          | 0.8028   |
| 0.1075        | 84.9231 | 276  | 0.5748          | 0.8028   |
| 0.1075        | 85.8462 | 279  | 0.5776          | 0.8169   |
| 0.1013        | 86.7692 | 282  | 0.5844          | 0.8028   |
| 0.1013        | 88.0    | 286  | 0.5930          | 0.8028   |
| 0.1013        | 88.9231 | 289  | 0.6020          | 0.7887   |
| 0.1092        | 89.8462 | 292  | 0.6055          | 0.7746   |
| 0.1092        | 90.7692 | 295  | 0.6075          | 0.7746   |
| 0.1092        | 92.0    | 299  | 0.6080          | 0.7746   |
| 0.1096        | 92.3077 | 300  | 0.6081          | 0.7746   |

### Framework versions

- Transformers 4.41.0
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
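
When reproducing results, a quick sanity check that the local environment matches these versions can help; this is a convenience snippet, not part of the original training setup.

```python
import datasets
import tokenizers
import torch
import transformers

# Print installed versions to compare against the list above.
for name, module in [
    ("Transformers", transformers),
    ("PyTorch", torch),
    ("Datasets", datasets),
    ("Tokenizers", tokenizers),
]:
    print(f"{name}: {module.__version__}")
```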