
car_identified_model_7

This model is a fine-tuned version of apple/mobilevitv2-1.0-imagenet1k-256 on the imagefolder dataset (a local image dataset loaded with the Datasets imagefolder loader). It achieves the following results on the evaluation set (a metric-computation sketch follows the list):

  • Loss: 0.5755
  • F1: 0.3629
  • ROC AUC: 0.6990
  • Accuracy: 0.0714
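
The near-zero accuracy alongside non-trivial F1 and ROC AUC suggests a multi-label setup in which accuracy is exact-match (subset) accuracy. Below is a minimal sketch of how such metrics are typically computed with the Trainer; the sigmoid, the 0.5 threshold, and the micro averaging are assumptions, not documented by this card.

```python
import numpy as np
from sklearn.metrics import f1_score, roc_auc_score, accuracy_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    probs = 1 / (1 + np.exp(-logits))   # sigmoid over per-label logits (assumed multi-label)
    preds = (probs >= 0.5).astype(int)  # assumed decision threshold
    return {
        "f1": f1_score(labels, preds, average="micro"),  # averaging is an assumption
        "roc_auc": roc_auc_score(labels, probs, average="micro"),
        # On multi-label arrays, accuracy_score is exact-match (subset) accuracy,
        # which would explain the low values reported above.
        "accuracy": accuracy_score(labels, preds),
    }
```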

Model description

More information needed

Intended uses & limitations

More information needed
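
Pending proper documentation, here is a minimal inference sketch. The repo id is hypothetical (replace it with this model's actual Hub path), and the sigmoid-plus-threshold decoding assumes the multi-label setup suggested by the metrics above.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "your-username/car_identified_model_7"  # hypothetical Hub path
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("car.jpg")  # any RGB image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# For a multi-label head, score each label independently instead of taking argmax.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs.tolist()) if p >= 0.5]
print(predicted)
```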

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the Trainer sketch after the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 200
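
As a sketch, these settings map onto the transformers Trainer API roughly as follows; output_dir and any defaults not listed above are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="car_identified_model_7",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=8,   # 16 x 8 = 128 total train batch size
    num_train_epochs=200,
    lr_scheduler_type="linear",
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 matches the optimizer defaults.
)
```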

Training results

Training Loss  Epoch  Step  Validation Loss  F1  ROC AUC  Accuracy
0.6919 0.73 1 0.6887 0.1786 0.5738 0.0
0.6919 1.45 2 0.6856 0.1818 0.5761 0.0
0.6919 2.91 4 0.6802 0.2116 0.6066 0.0
0.6919 3.64 5 0.6800 0.1861 0.5826 0.0
0.6919 4.36 6 0.6858 0.1905 0.5973 0.0
0.6919 5.82 8 0.6938 0.1549 0.5342 0.0
0.6919 6.55 9 0.6917 0.1805 0.5802 0.0
0.6919 8.0 11 0.6735 0.1905 0.5932 0.0
0.6919 8.73 12 0.6727 0.1952 0.6007 0.0
0.6919 9.45 13 0.6698 0.2061 0.6172 0.0
0.6919 10.91 15 0.6672 0.2008 0.6092 0.0
0.6919 11.64 16 0.6645 0.2092 0.6196 0.0
0.6919 12.36 17 0.6646 0.2049 0.6144 0.0
0.6919 13.82 19 0.6623 0.2081 0.6167 0.0
0.6919 14.55 20 0.6607 0.2078 0.6149 0.0
0.6919 16.0 22 0.6585 0.2203 0.6320 0.0
0.6919 16.73 23 0.6562 0.2156 0.6219 0.0
0.6919 17.45 24 0.6555 0.2182 0.6263 0.0
0.6919 18.91 26 0.6522 0.2185 0.6232 0.0
0.6919 19.64 27 0.6512 0.2228 0.6273 0.0
0.6919 20.36 28 0.6501 0.2356 0.6410 0.0
0.6919 21.82 30 0.6477 0.2280 0.6284 0.0
0.6919 22.55 31 0.6476 0.2326 0.6343 0.0
0.6919 24.0 33 0.6469 0.2408 0.6434 0.0
0.6919 24.73 34 0.6432 0.2409 0.6369 0.0
0.6919 25.45 35 0.6432 0.2431 0.6408 0.0
0.6919 26.91 37 0.6402 0.2486 0.6449 0.0
0.6919 27.64 38 0.6386 0.2686 0.6664 0.0
0.6919 28.36 39 0.6376 0.2762 0.6796 0.0
0.6919 29.82 41 0.6347 0.2692 0.6721 0.0
0.6919 30.55 42 0.6339 0.2655 0.6643 0.0
0.6919 32.0 44 0.6310 0.2674 0.6630 0.0
0.6919 32.73 45 0.6307 0.2789 0.6731 0.0
0.6919 33.45 46 0.6291 0.2714 0.6656 0.0
0.6919 34.91 48 0.6271 0.2761 0.6659 0.0
0.6919 35.64 49 0.6271 0.2687 0.6612 0.0
0.6919 36.36 50 0.6277 0.2606 0.6509 0.0
0.6919 37.82 52 0.6257 0.2741 0.6620 0.0
0.6919 38.55 53 0.6244 0.2892 0.6793 0.0
0.6919 40.0 55 0.6203 0.2968 0.6806 0.0
0.6919 40.73 56 0.6198 0.2902 0.6770 0.0
0.6919 41.45 57 0.6184 0.3023 0.6866 0.0
0.6919 42.91 59 0.6163 0.2977 0.6812 0.0
0.6919 43.64 60 0.6147 0.3322 0.7112 0.0
0.6919 44.36 61 0.6154 0.3197 0.6954 0.0
0.6919 45.82 63 0.6129 0.3016 0.6832 0.0
0.6919 46.55 64 0.6112 0.3020 0.6804 0.0
0.6919 48.0 66 0.6095 0.2961 0.6773 0.0
0.6919 48.73 67 0.6091 0.3133 0.6923 0.0
0.6919 49.45 68 0.6090 0.3265 0.7019 0.0
0.6919 50.91 70 0.6077 0.3093 0.6840 0.0
0.6919 51.64 71 0.6065 0.3239 0.6941 0.0
0.6919 52.36 72 0.6058 0.3237 0.6907 0.0
0.6919 53.82 74 0.6028 0.3285 0.6928 0.0
0.6919 54.55 75 0.6038 0.3285 0.6928 0.0238
0.6919 56.0 77 0.6056 0.3197 0.6825 0.0
0.6919 56.73 78 0.6074 0.3249 0.6913 0.0
0.6919 57.45 79 0.6030 0.3158 0.6775 0.0238
0.6919 58.91 81 0.6001 0.3359 0.6925 0.0238
0.6919 59.64 82 0.5993 0.3409 0.6980 0.0238
0.6919 60.36 83 0.6017 0.3259 0.6884 0.0238
0.6919 61.82 85 0.6009 0.3146 0.6770 0.0238
0.6919 62.55 86 0.6018 0.3197 0.6825 0.0238
0.6919 64.0 88 0.5975 0.3130 0.6731 0.0238
0.6919 64.73 89 0.5978 0.3271 0.6889 0.0238
0.6919 65.45 90 0.5967 0.3424 0.6951 0.0238
0.6919 66.91 92 0.5973 0.3125 0.6698 0.0238
0.6919 67.64 93 0.5956 0.3372 0.6931 0.0238
0.6919 68.36 94 0.5922 0.3373 0.6897 0.0238
0.6919 69.82 96 0.5949 0.3320 0.6843 0.0476
0.6919 70.55 97 0.5959 0.3413 0.6913 0.0476
0.6919 72.0 99 0.5944 0.3420 0.7019 0.0238
0.6919 72.73 100 0.5955 0.3333 0.6881 0.0476
0.6919 73.45 101 0.5933 0.3346 0.6887 0.0238
0.6919 74.91 103 0.5894 0.3543 0.7032 0.0238
0.6919 75.64 104 0.5903 0.3424 0.6951 0.0238
0.6919 76.36 105 0.5890 0.3411 0.6946 0.0476
0.6919 77.82 107 0.5922 0.3346 0.6887 0.0476
0.6919 78.55 108 0.5923 0.3243 0.6812 0.0476
0.6919 80.0 110 0.5908 0.3468 0.6933 0.0476
0.6919 80.73 111 0.5922 0.3280 0.6793 0.0476
0.6919 81.45 112 0.5892 0.3440 0.6923 0.0238
0.6919 82.91 114 0.5880 0.3506 0.6982 0.0238
0.6919 83.64 115 0.5869 0.3454 0.6928 0.0476
0.6919 84.36 116 0.5841 0.3465 0.6967 0.0238
0.6919 85.82 118 0.5841 0.3568 0.6969 0.0714
0.6919 86.55 119 0.5843 0.3496 0.6944 0.0476
0.6919 88.0 121 0.5860 0.3598 0.6980 0.0476
0.6919 88.73 122 0.5837 0.3457 0.6894 0.0476
0.6919 89.45 123 0.5826 0.3636 0.7029 0.0714
0.6919 90.91 125 0.5822 0.3651 0.7034 0.0714
0.6919 91.64 126 0.5814 0.3607 0.7019 0.0714
0.6919 92.36 127 0.5814 0.3629 0.7063 0.0476
0.6919 93.82 129 0.5818 0.3713 0.7055 0.0714
0.6919 94.55 130 0.5802 0.3766 0.7109 0.0714
0.6919 96.0 132 0.5803 0.3675 0.7006 0.0714
0.6919 96.73 133 0.5825 0.3519 0.6881 0.0714
0.6919 97.45 134 0.5790 0.3629 0.6990 0.0714
0.6919 98.91 136 0.5795 0.3766 0.7109 0.0714
0.6919 99.64 137 0.5784 0.3697 0.7050 0.0714
0.6919 100.36 138 0.5819 0.3583 0.6975 0.0714
0.6919 101.82 140 0.5834 0.3525 0.6954 0.0476
0.6919 102.55 141 0.5825 0.3689 0.7083 0.0238
0.6919 104.0 143 0.5839 0.3460 0.6861 0.0714
0.6919 104.73 144 0.5838 0.3333 0.6814 0.0476
0.6919 105.45 145 0.5801 0.3387 0.6869 0.0238
0.6919 106.91 147 0.5811 0.3515 0.6915 0.0476
0.6919 107.64 148 0.5793 0.3374 0.6830 0.0476
0.6919 108.36 149 0.5766 0.3448 0.6822 0.0714
0.6919 109.82 151 0.5760 0.3445 0.6856 0.0714
0.6919 110.55 152 0.5757 0.3559 0.6931 0.0714
0.6919 112.0 154 0.5760 0.3475 0.6866 0.0714
0.6919 112.73 155 0.5743 0.3629 0.6990 0.0714
0.6919 113.45 156 0.5732 0.3636 0.7029 0.0714
0.6919 114.91 158 0.5736 0.3786 0.7153 0.0476
0.6919 115.64 159 0.5764 0.3667 0.7039 0.0238
0.6919 116.36 160 0.5765 0.3613 0.6985 0.0476
0.6919 117.82 162 0.5749 0.3574 0.6936 0.0714
0.6919 118.55 163 0.5754 0.3592 0.7013 0.0476
0.6919 120.0 165 0.5757 0.3665 0.7112 0.0476
0.6919 120.73 166 0.5771 0.3729 0.7060 0.0714
0.6919 121.45 167 0.5746 0.3629 0.6990 0.0714
0.6919 122.91 169 0.5758 0.3644 0.6995 0.0714
0.6919 123.64 170 0.5745 0.3559 0.6931 0.0714
0.6919 124.36 171 0.5758 0.3544 0.6925 0.0714
0.6919 125.82 173 0.5759 0.3598 0.6980 0.0714
0.6919 126.55 174 0.5772 0.3568 0.6969 0.0714
0.6919 128.0 176 0.5747 0.3583 0.6975 0.0714
0.6919 128.73 177 0.5738 0.3644 0.6995 0.0714
0.6919 129.45 178 0.5751 0.3644 0.6995 0.0714
0.6919 130.91 180 0.5741 0.3713 0.7055 0.0952
0.6919 131.64 181 0.5748 0.3713 0.7055 0.0952
0.6919 132.36 182 0.5767 0.3660 0.7001 0.0714
0.6919 133.82 184 0.5732 0.3660 0.7001 0.0952
0.6919 134.55 185 0.5742 0.3772 0.7037 0.0952
0.6919 136.0 187 0.5690 0.3755 0.7032 0.0952
0.6919 136.73 188 0.5699 0.3805 0.7047 0.0714
0.6919 137.45 189 0.5743 0.3707 0.7016 0.0714
0.6919 138.91 191 0.5740 0.3529 0.6920 0.0952
0.6919 139.64 192 0.5740 0.3660 0.7001 0.0714
0.6919 140.36 193 0.5734 0.3644 0.6995 0.0714
0.6919 141.82 195 0.5740 0.3675 0.7006 0.0714
0.6919 142.55 196 0.5721 0.3707 0.7016 0.0714
0.6919 144.0 198 0.5725 0.3767 0.6998 0.0714
0.6919 144.73 199 0.5734 0.3729 0.7060 0.0952
0.6919 145.45 200 0.5755 0.3629 0.6990 0.0714

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.1.0+cu121
  • Datasets 2.14.6
  • Tokenizers 0.14.1