---
license: apache-2.0
base_model: facebook/deit-base-distilled-patch16-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: deit-base-distilled-patch16-224-65-fold4
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.8169014084507042
---

# deit-base-distilled-patch16-224-65-fold4

This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7218
- Accuracy: 0.8169
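
Since the card does not yet include usage instructions, here is a minimal inference sketch using the 🤗 Transformers `pipeline` API. The repo id is an assumption inferred from the model name, and `example.jpg` is a placeholder; adjust both to your setup.

```python
from transformers import pipeline

# Assumed repo id, inferred from this card's model name; point it at a local
# directory or the actual hub location if the checkpoint lives elsewhere.
classifier = pipeline(
    "image-classification",
    model="BilalMuftuoglu/deit-base-distilled-patch16-224-65-fold4",
)

# "example.jpg" is a placeholder; a local path, URL, or PIL.Image also works.
predictions = classifier("example.jpg")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```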

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto code follows the list):

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
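
For concreteness, here is a hedged sketch of how these settings map onto `TrainingArguments` in 🤗 Transformers. The `output_dir` and `num_labels` are assumptions (the card does not describe the dataset or label count), and the optimizer betas/epsilon listed above are the Trainer's Adam defaults, so they need no explicit configuration.

```python
# A sketch, not the original training script: it reproduces the listed
# hyperparameters under the assumptions noted in the comments.
from transformers import AutoModelForImageClassification, TrainingArguments

model = AutoModelForImageClassification.from_pretrained(
    "facebook/deit-base-distilled-patch16-224",
    num_labels=2,                  # assumed label count, not stated on the card
    ignore_mismatched_sizes=True,  # replace the 1000-class ImageNet head
)

training_args = TrainingArguments(
    output_dir="deit-base-distilled-patch16-224-65-fold4",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,  # effective batch: 32 * 4 = 128 on one device
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
)
```

Note that `total_train_batch_size` (128) is derived rather than set directly: it is `per_device_train_batch_size` times `gradient_accumulation_steps` on a single device.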

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| No log        | 0.9231  | 3    | 0.7266          | 0.4648   |
| No log        | 1.8462  | 6    | 0.8116          | 0.5211   |
| No log        | 2.7692  | 9    | 0.7081          | 0.4648   |
| 0.7173        | 4.0     | 13   | 0.6645          | 0.5634   |
| 0.7173        | 4.9231  | 16   | 0.6441          | 0.5915   |
| 0.7173        | 5.8462  | 19   | 0.6400          | 0.6761   |
| 0.6351        | 6.7692  | 22   | 0.6055          | 0.6620   |
| 0.6351        | 8.0     | 26   | 0.7770          | 0.5352   |
| 0.6351        | 8.9231  | 29   | 0.6259          | 0.6901   |
| 0.5434        | 9.8462  | 32   | 0.5889          | 0.7183   |
| 0.5434        | 10.7692 | 35   | 0.7283          | 0.6479   |
| 0.5434        | 12.0    | 39   | 0.6898          | 0.6479   |
| 0.4861        | 12.9231 | 42   | 0.6429          | 0.7183   |
| 0.4861        | 13.8462 | 45   | 0.6915          | 0.6620   |
| 0.4861        | 14.7692 | 48   | 0.5702          | 0.7183   |
| 0.4285        | 16.0    | 52   | 0.6356          | 0.7042   |
| 0.4285        | 16.9231 | 55   | 0.6981          | 0.6761   |
| 0.4285        | 17.8462 | 58   | 0.5218          | 0.7183   |
| 0.3781        | 18.7692 | 61   | 0.5340          | 0.7183   |
| 0.3781        | 20.0    | 65   | 0.7611          | 0.6761   |
| 0.3781        | 20.9231 | 68   | 0.5939          | 0.7465   |
| 0.3516        | 21.8462 | 71   | 0.6186          | 0.7887   |
| 0.3516        | 22.7692 | 74   | 0.7122          | 0.7042   |
| 0.3516        | 24.0    | 78   | 0.5931          | 0.7887   |
| 0.296         | 24.9231 | 81   | 0.6305          | 0.6901   |
| 0.296         | 25.8462 | 84   | 0.8947          | 0.7042   |
| 0.296         | 26.7692 | 87   | 0.6217          | 0.7183   |
| 0.2741        | 28.0    | 91   | 0.7218          | 0.8169   |
| 0.2741        | 28.9231 | 94   | 0.6687          | 0.7887   |
| 0.2741        | 29.8462 | 97   | 0.6648          | 0.8028   |
| 0.2559        | 30.7692 | 100  | 0.6433          | 0.7746   |
| 0.2559        | 32.0    | 104  | 0.6674          | 0.7324   |
| 0.2559        | 32.9231 | 107  | 0.6643          | 0.7465   |
| 0.2001        | 33.8462 | 110  | 0.6247          | 0.7465   |
| 0.2001        | 34.7692 | 113  | 0.6344          | 0.6901   |
| 0.2001        | 36.0    | 117  | 0.7072          | 0.7606   |
| 0.1728        | 36.9231 | 120  | 0.7146          | 0.7465   |
| 0.1728        | 37.8462 | 123  | 0.8212          | 0.7606   |
| 0.1728        | 38.7692 | 126  | 0.7901          | 0.7324   |
| 0.2109        | 40.0    | 130  | 0.8235          | 0.7465   |
| 0.2109        | 40.9231 | 133  | 0.9196          | 0.6901   |
| 0.2109        | 41.8462 | 136  | 0.7758          | 0.7606   |
| 0.2109        | 42.7692 | 139  | 0.7692          | 0.7183   |
| 0.1634        | 44.0    | 143  | 0.8310          | 0.7606   |
| 0.1634        | 44.9231 | 146  | 0.7550          | 0.7465   |
| 0.1634        | 45.8462 | 149  | 0.7646          | 0.7324   |
| 0.148         | 46.7692 | 152  | 0.7208          | 0.7606   |
| 0.148         | 48.0    | 156  | 0.7324          | 0.7606   |
| 0.148         | 48.9231 | 159  | 0.7856          | 0.7606   |
| 0.1568        | 49.8462 | 162  | 0.8033          | 0.7606   |
| 0.1568        | 50.7692 | 165  | 0.9007          | 0.7746   |
| 0.1568        | 52.0    | 169  | 0.8179          | 0.7606   |
| 0.1659        | 52.9231 | 172  | 0.7775          | 0.7606   |
| 0.1659        | 53.8462 | 175  | 0.7214          | 0.7606   |
| 0.1659        | 54.7692 | 178  | 0.7385          | 0.7465   |
| 0.1352        | 56.0    | 182  | 0.7434          | 0.7465   |
| 0.1352        | 56.9231 | 185  | 0.8971          | 0.7042   |
| 0.1352        | 57.8462 | 188  | 0.7821          | 0.7606   |
| 0.1309        | 58.7692 | 191  | 0.7896          | 0.7465   |
| 0.1309        | 60.0    | 195  | 0.8340          | 0.7465   |
| 0.1309        | 60.9231 | 198  | 0.8154          | 0.7746   |
| 0.1201        | 61.8462 | 201  | 0.8185          | 0.7606   |
| 0.1201        | 62.7692 | 204  | 0.9640          | 0.7183   |
| 0.1201        | 64.0    | 208  | 0.8485          | 0.7606   |
| 0.1291        | 64.9231 | 211  | 0.8807          | 0.7324   |
| 0.1291        | 65.8462 | 214  | 0.8653          | 0.7183   |
| 0.1291        | 66.7692 | 217  | 0.8744          | 0.7324   |
| 0.124         | 68.0    | 221  | 0.8723          | 0.7324   |
| 0.124         | 68.9231 | 224  | 0.8948          | 0.7606   |
| 0.124         | 69.8462 | 227  | 0.9777          | 0.7183   |
| 0.1262        | 70.7692 | 230  | 0.9409          | 0.7746   |
| 0.1262        | 72.0    | 234  | 0.9618          | 0.7465   |
| 0.1262        | 72.9231 | 237  | 0.9642          | 0.7606   |
| 0.1036        | 73.8462 | 240  | 0.9738          | 0.7465   |
| 0.1036        | 74.7692 | 243  | 0.9788          | 0.7324   |
| 0.1036        | 76.0    | 247  | 1.0114          | 0.7465   |
| 0.1183        | 76.9231 | 250  | 1.0004          | 0.7465   |
| 0.1183        | 77.8462 | 253  | 1.0407          | 0.7465   |
| 0.1183        | 78.7692 | 256  | 1.1510          | 0.7324   |
| 0.0981        | 80.0    | 260  | 1.0718          | 0.7465   |
| 0.0981        | 80.9231 | 263  | 0.9988          | 0.7324   |
| 0.0981        | 81.8462 | 266  | 1.0054          | 0.7042   |
| 0.0981        | 82.7692 | 269  | 0.9896          | 0.7324   |
| 0.106         | 84.0    | 273  | 0.9851          | 0.7324   |
| 0.106         | 84.9231 | 276  | 0.9770          | 0.7465   |
| 0.106         | 85.8462 | 279  | 0.9623          | 0.7183   |
| 0.114         | 86.7692 | 282  | 0.9664          | 0.7042   |
| 0.114         | 88.0    | 286  | 0.9780          | 0.7042   |
| 0.114         | 88.9231 | 289  | 0.9670          | 0.7183   |
| 0.1157        | 89.8462 | 292  | 0.9586          | 0.7324   |
| 0.1157        | 90.7692 | 295  | 0.9587          | 0.7183   |
| 0.1157        | 92.0    | 299  | 0.9611          | 0.7042   |
| 0.0834        | 92.3077 | 300  | 0.9612          | 0.7042   |

### Framework versions

- Transformers 4.41.0
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1