---
license: apache-2.0
base_model: facebook/deit-base-distilled-patch16-224
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: deit-base-distilled-patch16-224-hasta-75-fold2
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.9166666666666666
---

# deit-base-distilled-patch16-224-hasta-75-fold2

This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.5330
- Accuracy: 0.9167
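A minimal inference sketch for a checkpoint like this one. The repo id below is an assumption based on the model name in this card; point it at wherever the checkpoint is actually hosted (Hub repo or local directory).

```python
# Repo id is an assumption taken from the model name in this card.
MODEL_ID = "deit-base-distilled-patch16-224-hasta-75-fold2"

def classify(image_path: str):
    """Classify a single image with the fine-tuned DeiT checkpoint."""
    # Imported lazily so this module loads even without transformers installed.
    from transformers import pipeline

    classifier = pipeline("image-classification", model=MODEL_ID)
    # Returns a list of {"label": ..., "score": ...} dicts, best score first.
    return classifier(image_path)
```

Calling `classify("sample.jpg")` returns the predicted labels with their scores.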

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
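The derived values above follow directly from the base settings; a quick check of the arithmetic (the one-optimizer-step-per-epoch figure is read off the training-results table):

```python
# Base hyperparameters from the list above.
train_batch_size = 32
gradient_accumulation_steps = 4
num_epochs = 100
steps_per_epoch = 1  # from the training-results table: step == epoch
warmup_ratio = 0.1

# Each optimizer step accumulates gradients over 4 batches of 32 images.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 128, matching total_train_batch_size above

# Linear schedule: LR warms up over the first 10% of steps, then decays to 0.
total_steps = num_epochs * steps_per_epoch
warmup_steps = int(warmup_ratio * total_steps)
print(warmup_steps)  # 10
```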

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 1    | 0.8721          | 0.5833   |
| No log        | 2.0   | 2    | 0.7324          | 0.8333   |
| No log        | 3.0   | 3    | 0.5330          | 0.9167   |
| No log        | 4.0   | 4    | 0.4022          | 0.9167   |
| No log        | 5.0   | 5    | 0.3794          | 0.9167   |
| No log        | 6.0   | 6    | 0.3910          | 0.9167   |
| No log        | 7.0   | 7    | 0.3832          | 0.9167   |
| No log        | 8.0   | 8    | 0.3524          | 0.9167   |
| No log        | 9.0   | 9    | 0.4280          | 0.9167   |
| 0.3237        | 10.0  | 10   | 0.5286          | 0.8333   |
| 0.3237        | 11.0  | 11   | 0.4004          | 0.9167   |
| 0.3237        | 12.0  | 12   | 0.3327          | 0.9167   |
| 0.3237        | 13.0  | 13   | 0.3136          | 0.9167   |
| 0.3237        | 14.0  | 14   | 0.2844          | 0.9167   |
| 0.3237        | 15.0  | 15   | 0.2493          | 0.9167   |
| 0.3237        | 16.0  | 16   | 0.2190          | 0.9167   |
| 0.3237        | 17.0  | 17   | 0.1925          | 0.9167   |
| 0.3237        | 18.0  | 18   | 0.1722          | 0.9167   |
| 0.3237        | 19.0  | 19   | 0.1387          | 0.9167   |
| 0.142         | 20.0  | 20   | 0.1259          | 0.9167   |
| 0.142         | 21.0  | 21   | 0.1443          | 0.9167   |
| 0.142         | 22.0  | 22   | 0.1372          | 0.9167   |
| 0.142         | 23.0  | 23   | 0.1043          | 0.9167   |
| 0.142         | 24.0  | 24   | 0.1022          | 0.9167   |
| 0.142         | 25.0  | 25   | 0.1327          | 0.9167   |
| 0.142         | 26.0  | 26   | 0.2213          | 0.9167   |
| 0.142         | 27.0  | 27   | 0.2587          | 0.9167   |
| 0.142         | 28.0  | 28   | 0.2411          | 0.9167   |
| 0.142         | 29.0  | 29   | 0.1915          | 0.9167   |
| 0.0723        | 30.0  | 30   | 0.1418          | 0.9167   |
| 0.0723        | 31.0  | 31   | 0.1369          | 0.9167   |
| 0.0723        | 32.0  | 32   | 0.1749          | 0.9167   |
| 0.0723        | 33.0  | 33   | 0.2607          | 0.9167   |
| 0.0723        | 34.0  | 34   | 0.3049          | 0.9167   |
| 0.0723        | 35.0  | 35   | 0.3103          | 0.9167   |
| 0.0723        | 36.0  | 36   | 0.2972          | 0.9167   |
| 0.0723        | 37.0  | 37   | 0.2901          | 0.9167   |
| 0.0723        | 38.0  | 38   | 0.2490          | 0.9167   |
| 0.0723        | 39.0  | 39   | 0.2047          | 0.9167   |
| 0.0458        | 40.0  | 40   | 0.1781          | 0.9167   |
| 0.0458        | 41.0  | 41   | 0.1712          | 0.9167   |
| 0.0458        | 42.0  | 42   | 0.2114          | 0.9167   |
| 0.0458        | 43.0  | 43   | 0.2837          | 0.9167   |
| 0.0458        | 44.0  | 44   | 0.3335          | 0.9167   |
| 0.0458        | 45.0  | 45   | 0.3600          | 0.9167   |
| 0.0458        | 46.0  | 46   | 0.3698          | 0.9167   |
| 0.0458        | 47.0  | 47   | 0.3607          | 0.9167   |
| 0.0458        | 48.0  | 48   | 0.3493          | 0.9167   |
| 0.0458        | 49.0  | 49   | 0.3408          | 0.9167   |
| 0.0478        | 50.0  | 50   | 0.3538          | 0.9167   |
| 0.0478        | 51.0  | 51   | 0.3481          | 0.9167   |
| 0.0478        | 52.0  | 52   | 0.3513          | 0.9167   |
| 0.0478        | 53.0  | 53   | 0.3336          | 0.9167   |
| 0.0478        | 54.0  | 54   | 0.3044          | 0.9167   |
| 0.0478        | 55.0  | 55   | 0.2844          | 0.9167   |
| 0.0478        | 56.0  | 56   | 0.2790          | 0.9167   |
| 0.0478        | 57.0  | 57   | 0.2990          | 0.9167   |
| 0.0478        | 58.0  | 58   | 0.3265          | 0.9167   |
| 0.0478        | 59.0  | 59   | 0.3682          | 0.9167   |
| 0.0145        | 60.0  | 60   | 0.3938          | 0.9167   |
| 0.0145        | 61.0  | 61   | 0.4018          | 0.9167   |
| 0.0145        | 62.0  | 62   | 0.3817          | 0.9167   |
| 0.0145        | 63.0  | 63   | 0.3376          | 0.9167   |
| 0.0145        | 64.0  | 64   | 0.2812          | 0.9167   |
| 0.0145        | 65.0  | 65   | 0.2029          | 0.9167   |
| 0.0145        | 66.0  | 66   | 0.1343          | 0.9167   |
| 0.0145        | 67.0  | 67   | 0.0996          | 0.9167   |
| 0.0145        | 68.0  | 68   | 0.0811          | 0.9167   |
| 0.0145        | 69.0  | 69   | 0.0662          | 0.9167   |
| 0.0447        | 70.0  | 70   | 0.0745          | 0.9167   |
| 0.0447        | 71.0  | 71   | 0.1053          | 0.9167   |
| 0.0447        | 72.0  | 72   | 0.1643          | 0.9167   |
| 0.0447        | 73.0  | 73   | 0.2353          | 0.9167   |
| 0.0447        | 74.0  | 74   | 0.3155          | 0.9167   |
| 0.0447        | 75.0  | 75   | 0.3678          | 0.9167   |
| 0.0447        | 76.0  | 76   | 0.3946          | 0.9167   |
| 0.0447        | 77.0  | 77   | 0.4025          | 0.9167   |
| 0.0447        | 78.0  | 78   | 0.4106          | 0.9167   |
| 0.0447        | 79.0  | 79   | 0.4147          | 0.9167   |
| 0.0229        | 80.0  | 80   | 0.4108          | 0.9167   |
| 0.0229        | 81.0  | 81   | 0.3993          | 0.9167   |
| 0.0229        | 82.0  | 82   | 0.3857          | 0.9167   |
| 0.0229        | 83.0  | 83   | 0.3644          | 0.9167   |
| 0.0229        | 84.0  | 84   | 0.3422          | 0.9167   |
| 0.0229        | 85.0  | 85   | 0.3280          | 0.9167   |
| 0.0229        | 86.0  | 86   | 0.3108          | 0.9167   |
| 0.0229        | 87.0  | 87   | 0.2936          | 0.9167   |
| 0.0229        | 88.0  | 88   | 0.2846          | 0.9167   |
| 0.0229        | 89.0  | 89   | 0.2861          | 0.9167   |
| 0.0317        | 90.0  | 90   | 0.2909          | 0.9167   |
| 0.0317        | 91.0  | 91   | 0.2921          | 0.9167   |
| 0.0317        | 92.0  | 92   | 0.2938          | 0.9167   |
| 0.0317        | 93.0  | 93   | 0.2974          | 0.9167   |
| 0.0317        | 94.0  | 94   | 0.2998          | 0.9167   |
| 0.0317        | 95.0  | 95   | 0.2994          | 0.9167   |
| 0.0317        | 96.0  | 96   | 0.2992          | 0.9167   |
| 0.0317        | 97.0  | 97   | 0.2973          | 0.9167   |
| 0.0317        | 98.0  | 98   | 0.2969          | 0.9167   |
| 0.0317        | 99.0  | 99   | 0.2970          | 0.9167   |
| 0.0283        | 100.0 | 100  | 0.2973          | 0.9167   |
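Every accuracy in the table is (to rounding) a multiple of 1/12 — 0.5833 ≈ 7/12, 0.8333 ≈ 10/12, 0.9167 ≈ 11/12 — which is consistent with, though not proof of, an evaluation split of 12 images. A quick check of that observation:

```python
# Each distinct accuracy value from the table, paired with the k in k/12
# that it appears to round from. The 12-image eval set is an inference,
# not something stated in this card.
for acc, k in [(0.5833, 7), (0.8333, 10), (0.9167, 11)]:
    assert abs(acc - k / 12) < 5e-5, (acc, k)
print("all table accuracies match some k/12")
```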

### Framework versions

- Transformers 4.41.0
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1