
car_identified_model_9

This model is a fine-tuned version of apple/mobilevitv2-1.0-imagenet1k-256 on an image dataset loaded with the Hugging Face Datasets imagefolder builder. It achieves the following results on the evaluation set:

  • Loss: 0.3580
  • F1: 0.8333
  • Roc Auc: 0.8333
  • Accuracy: 0.6667

Model description

More information needed

Intended uses & limitations

More information needed
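
Until that information is filled in, a minimal inference sketch may still be useful. It assumes the fine-tuned weights are available under the hypothetical path `car_identified_model_9` (substitute a Hub id or local directory) and that `example_car.jpg` is a placeholder image. The sigmoid is an assumption: the card's paired F1/Roc Auc metrics suggest a multi-label setup; use a softmax instead if the model was trained single-label.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical checkpoint location; substitute the actual path or Hub id.
ckpt = "car_identified_model_9"

processor = AutoImageProcessor.from_pretrained(ckpt)
model = AutoModelForImageClassification.from_pretrained(ckpt)
model.eval()

image = Image.open("example_car.jpg")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Sigmoid (rather than softmax) assumes a multi-label objective, suggested by
# the card's F1/Roc Auc metrics; this is not stated in the card itself.
probs = torch.sigmoid(logits)[0]
for label_id, p in enumerate(probs.tolist()):
    print(model.config.id2label[label_id], round(p, 4))
```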

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 200
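
These settings map directly onto `transformers` `TrainingArguments`; the sketch below restates them for reproducibility. `output_dir` and anything not in the list above are illustrative assumptions, not values from the card.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters; output_dir is an assumption.
training_args = TrainingArguments(
    output_dir="car_identified_model_9",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=8,  # 16 per device x 8 steps = 128 effective
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```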

Training results

| Training Loss | Epoch | Step | Validation Loss | F1 | Roc Auc | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|:--------:|
| 0.2586 | 1.0 | 1 | 0.6925 | 0.3889 | 0.5417 | 0.0 |
| 0.2586 | 2.0 | 2 | 0.6926 | 0.3889 | 0.5417 | 0.0 |
| 0.2586 | 3.0 | 4 | 0.6924 | 0.3889 | 0.5417 | 0.0 |
| 0.2586 | 4.0 | 5 | 0.6922 | 0.3889 | 0.5417 | 0.0 |
| 0.2586 | 5.0 | 6 | 0.6917 | 0.5 | 0.5 | 0.0 |
| 0.2586 | 6.0 | 8 | 0.6904 | 0.6667 | 0.5833 | 0.0 |
| 0.2586 | 7.0 | 9 | 0.6890 | 0.6667 | 0.5833 | 0.0 |
| 0.2586 | 8.0 | 10 | 0.6871 | 0.6667 | 0.5833 | 0.0 |
| 0.2586 | 9.0 | 11 | 0.6845 | 0.6667 | 0.5833 | 0.0 |
| 0.2586 | 10.0 | 12 | 0.6813 | 0.6667 | 0.5833 | 0.0 |
| 0.2586 | 11.0 | 14 | 0.6764 | 0.6667 | 0.5833 | 0.0 |
| 0.2586 | 12.0 | 15 | 0.6724 | 0.6667 | 0.5833 | 0.0 |
| 0.2586 | 13.0 | 16 | 0.6689 | 0.6667 | 0.5833 | 0.0 |
| 0.2586 | 14.0 | 18 | 0.6653 | 0.6667 | 0.5833 | 0.0 |
| 0.2586 | 15.0 | 19 | 0.6634 | 0.6667 | 0.5833 | 0.0 |
| 0.2586 | 16.0 | 20 | 0.6618 | 0.625 | 0.625 | 0.3333 |
| 0.2586 | 17.0 | 21 | 0.6601 | 0.625 | 0.625 | 0.3333 |
| 0.2586 | 18.0 | 22 | 0.6586 | 0.625 | 0.625 | 0.3333 |
| 0.2586 | 19.0 | 24 | 0.6844 | 0.3889 | 0.5417 | 0.0 |
| 0.2586 | 20.0 | 25 | 0.8059 | 0.4583 | 0.4583 | 0.25 |
| 0.2586 | 21.0 | 26 | 0.9269 | 0.4583 | 0.4583 | 0.25 |
| 0.2586 | 22.0 | 28 | 1.0221 | 0.4583 | 0.4583 | 0.25 |
| 0.2586 | 23.0 | 29 | 1.0359 | 0.4583 | 0.4583 | 0.25 |
| 0.2586 | 24.0 | 30 | 1.0373 | 0.4583 | 0.4583 | 0.25 |
| 0.2586 | 25.0 | 31 | 1.0350 | 0.4583 | 0.4583 | 0.25 |
| 0.2586 | 26.0 | 32 | 0.9747 | 0.4583 | 0.4583 | 0.25 |
| 0.2586 | 27.0 | 34 | 0.8764 | 0.5417 | 0.5417 | 0.3333 |
| 0.2586 | 28.0 | 35 | 0.7686 | 0.5417 | 0.5417 | 0.3333 |
| 0.2586 | 29.0 | 36 | 0.6511 | 0.6667 | 0.6667 | 0.4167 |
| 0.2586 | 30.0 | 38 | 0.5987 | 0.75 | 0.75 | 0.5 |
| 0.2586 | 31.0 | 39 | 0.5267 | 0.75 | 0.75 | 0.5 |
| 0.2586 | 32.0 | 40 | 0.4412 | 0.7917 | 0.7917 | 0.5833 |
| 0.2586 | 33.0 | 41 | 0.3719 | 0.875 | 0.875 | 0.75 |
| 0.2586 | 34.0 | 42 | 0.3447 | 0.875 | 0.875 | 0.75 |
| 0.2586 | 35.0 | 44 | 0.3333 | 0.875 | 0.875 | 0.75 |
| 0.2586 | 36.0 | 45 | 0.3295 | 0.875 | 0.875 | 0.75 |
| 0.2586 | 37.0 | 46 | 0.3310 | 0.8980 | 0.8958 | 0.75 |
| 0.2586 | 38.0 | 48 | 0.3435 | 0.8980 | 0.8958 | 0.75 |
| 0.2586 | 39.0 | 49 | 0.3457 | 0.8980 | 0.8958 | 0.75 |
| 0.2586 | 40.0 | 50 | 0.3664 | 0.9167 | 0.9167 | 0.8333 |
| 0.2586 | 41.0 | 51 | 0.3809 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 42.0 | 52 | 0.4175 | 0.8400 | 0.8333 | 0.5833 |
| 0.2586 | 43.0 | 54 | 0.4183 | 0.8163 | 0.8125 | 0.5833 |
| 0.2586 | 44.0 | 55 | 0.4444 | 0.7755 | 0.7708 | 0.5833 |
| 0.2586 | 45.0 | 56 | 0.4301 | 0.8333 | 0.8333 | 0.75 |
| 0.2586 | 46.0 | 58 | 0.4282 | 0.8333 | 0.8333 | 0.75 |
| 0.2586 | 47.0 | 59 | 0.4202 | 0.8333 | 0.8333 | 0.75 |
| 0.2586 | 48.0 | 60 | 0.3871 | 0.875 | 0.875 | 0.8333 |
| 0.2586 | 49.0 | 61 | 0.3560 | 0.875 | 0.875 | 0.8333 |
| 0.2586 | 50.0 | 62 | 0.3330 | 0.8571 | 0.8542 | 0.75 |
| 0.2586 | 51.0 | 64 | 0.3034 | 0.8980 | 0.8958 | 0.75 |
| 0.2586 | 52.0 | 65 | 0.3170 | 0.8980 | 0.8958 | 0.75 |
| 0.2586 | 53.0 | 66 | 0.3288 | 0.8980 | 0.8958 | 0.75 |
| 0.2586 | 54.0 | 68 | 0.3157 | 0.9388 | 0.9375 | 0.8333 |
| 0.2586 | 55.0 | 69 | 0.3490 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 56.0 | 70 | 0.3491 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 57.0 | 71 | 0.3429 | 0.8980 | 0.8958 | 0.75 |
| 0.2586 | 58.0 | 72 | 0.3620 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 59.0 | 74 | 0.4072 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 60.0 | 75 | 0.4153 | 0.8333 | 0.8333 | 0.5833 |
| 0.2586 | 61.0 | 76 | 0.4254 | 0.8333 | 0.8333 | 0.5833 |
| 0.2586 | 62.0 | 78 | 0.4320 | 0.8163 | 0.8125 | 0.5833 |
| 0.2586 | 63.0 | 79 | 0.4318 | 0.8163 | 0.8125 | 0.5833 |
| 0.2586 | 64.0 | 80 | 0.4116 | 0.8163 | 0.8125 | 0.5833 |
| 0.2586 | 65.0 | 81 | 0.3835 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 66.0 | 82 | 0.3554 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 67.0 | 84 | 0.3407 | 0.875 | 0.875 | 0.75 |
| 0.2586 | 68.0 | 85 | 0.3252 | 0.875 | 0.875 | 0.75 |
| 0.2586 | 69.0 | 86 | 0.3069 | 0.875 | 0.875 | 0.75 |
| 0.2586 | 70.0 | 88 | 0.2970 | 0.875 | 0.875 | 0.75 |
| 0.2586 | 71.0 | 89 | 0.2934 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 72.0 | 90 | 0.2999 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 73.0 | 91 | 0.3068 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 74.0 | 92 | 0.3181 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 75.0 | 94 | 0.3391 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 76.0 | 95 | 0.3482 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 77.0 | 96 | 0.3577 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 78.0 | 98 | 0.4279 | 0.8163 | 0.8125 | 0.5833 |
| 0.2586 | 79.0 | 99 | 0.4492 | 0.7347 | 0.7292 | 0.5 |
| 0.2586 | 80.0 | 100 | 0.4291 | 0.7755 | 0.7708 | 0.5833 |
| 0.2586 | 81.0 | 101 | 0.4267 | 0.7755 | 0.7708 | 0.5833 |
| 0.2586 | 82.0 | 102 | 0.4160 | 0.7755 | 0.7708 | 0.5833 |
| 0.2586 | 83.0 | 104 | 0.4000 | 0.8333 | 0.8333 | 0.5833 |
| 0.2586 | 84.0 | 105 | 0.3792 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 85.0 | 106 | 0.3368 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 86.0 | 108 | 0.3480 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 87.0 | 109 | 0.3798 | 0.8333 | 0.8333 | 0.5833 |
| 0.2586 | 88.0 | 110 | 0.3806 | 0.8163 | 0.8125 | 0.5833 |
| 0.2586 | 89.0 | 111 | 0.3533 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 90.0 | 112 | 0.3428 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 91.0 | 114 | 0.3447 | 0.875 | 0.875 | 0.6667 |
| 0.2586 | 92.0 | 115 | 0.3440 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 93.0 | 116 | 0.3433 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 94.0 | 118 | 0.3584 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 95.0 | 119 | 0.3502 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 96.0 | 120 | 0.3413 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 97.0 | 121 | 0.3247 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 98.0 | 122 | 0.3232 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 99.0 | 124 | 0.3178 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 100.0 | 125 | 0.3201 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 101.0 | 126 | 0.3100 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 102.0 | 128 | 0.3137 | 0.8511 | 0.8542 | 0.6667 |
| 0.2586 | 103.0 | 129 | 0.3140 | 0.8511 | 0.8542 | 0.6667 |
| 0.2586 | 104.0 | 130 | 0.3332 | 0.8333 | 0.8333 | 0.5833 |
| 0.2586 | 105.0 | 131 | 0.3598 | 0.7917 | 0.7917 | 0.5833 |
| 0.2586 | 106.0 | 132 | 0.3742 | 0.8511 | 0.8542 | 0.5833 |
| 0.2586 | 107.0 | 134 | 0.3924 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 108.0 | 135 | 0.4015 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 109.0 | 136 | 0.4096 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 110.0 | 138 | 0.4227 | 0.7917 | 0.7917 | 0.5833 |
| 0.2586 | 111.0 | 139 | 0.4343 | 0.7917 | 0.7917 | 0.5833 |
| 0.2586 | 112.0 | 140 | 0.4349 | 0.7917 | 0.7917 | 0.5833 |
| 0.2586 | 113.0 | 141 | 0.4187 | 0.7917 | 0.7917 | 0.5833 |
| 0.2586 | 114.0 | 142 | 0.4037 | 0.7917 | 0.7917 | 0.5833 |
| 0.2586 | 115.0 | 144 | 0.3776 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 116.0 | 145 | 0.3763 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 117.0 | 146 | 0.3666 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 118.0 | 148 | 0.3539 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 119.0 | 149 | 0.3497 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 120.0 | 150 | 0.3375 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 121.0 | 151 | 0.3265 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 122.0 | 152 | 0.3154 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 123.0 | 154 | 0.3044 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 124.0 | 155 | 0.3225 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 125.0 | 156 | 0.3338 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 126.0 | 158 | 0.3363 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 127.0 | 159 | 0.3446 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 128.0 | 160 | 0.3458 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 129.0 | 161 | 0.3591 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 130.0 | 162 | 0.3573 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 131.0 | 164 | 0.3623 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 132.0 | 165 | 0.3636 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 133.0 | 166 | 0.3622 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 134.0 | 168 | 0.3569 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 135.0 | 169 | 0.3532 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 136.0 | 170 | 0.3576 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 137.0 | 171 | 0.3548 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 138.0 | 172 | 0.3503 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 139.0 | 174 | 0.3547 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 140.0 | 175 | 0.3484 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 141.0 | 176 | 0.3491 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 142.0 | 178 | 0.3511 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 143.0 | 179 | 0.3620 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 144.0 | 180 | 0.3616 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 145.0 | 181 | 0.3654 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 146.0 | 182 | 0.3541 | 0.8571 | 0.8542 | 0.6667 |
| 0.2586 | 147.0 | 184 | 0.3533 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 148.0 | 185 | 0.3675 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 149.0 | 186 | 0.3616 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 150.0 | 188 | 0.3747 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 151.0 | 189 | 0.3888 | 0.7917 | 0.7917 | 0.5833 |
| 0.2586 | 152.0 | 190 | 0.3884 | 0.8085 | 0.8125 | 0.5833 |
| 0.2586 | 153.0 | 191 | 0.3759 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 154.0 | 192 | 0.3743 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 155.0 | 194 | 0.3895 | 0.7917 | 0.7917 | 0.5833 |
| 0.2586 | 156.0 | 195 | 0.3965 | 0.7917 | 0.7917 | 0.5833 |
| 0.2586 | 157.0 | 196 | 0.3917 | 0.7917 | 0.7917 | 0.5833 |
| 0.2586 | 158.0 | 198 | 0.3845 | 0.7917 | 0.7917 | 0.5833 |
| 0.2586 | 159.0 | 199 | 0.3597 | 0.8333 | 0.8333 | 0.6667 |
| 0.2586 | 160.0 | 200 | 0.3580 | 0.8333 | 0.8333 | 0.6667 |
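
The identical F1 and Roc Auc columns, with a lower exact-match accuracy, are consistent with micro-averaged multi-label metrics computed from thresholded sigmoid outputs. The card does not state this, so treat the sketch below as an assumption about the evaluation setup (including the 0.5 threshold).

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

# Assumed metric computation for a multi-label setup; not confirmed by the card.
def compute_metrics(logits: np.ndarray, labels: np.ndarray, threshold: float = 0.5):
    # Sigmoid over raw logits, then threshold to hard per-label predictions.
    probs = 1.0 / (1.0 + np.exp(-logits))
    preds = (probs >= threshold).astype(int)
    return {
        "f1": f1_score(labels, preds, average="micro"),
        "roc_auc": roc_auc_score(labels, preds, average="micro"),
        "accuracy": accuracy_score(labels, preds),  # exact match over all labels
    }
```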

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.1.0+cu121
  • Datasets 2.14.6
  • Tokenizers 0.14.1