resnet101-base_tobacco-cnn_tobacco3482_kd

This model is a fine-tuned version of bdpc/resnet101-base_tobacco on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8158
  • Accuracy: 0.565
  • Brier Loss: 0.6104
  • NLL: 2.6027
  • F1 Micro: 0.565
  • F1 Macro: 0.4783
  • ECE: 0.2677
  • AURC: 0.2516
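
A minimal inference sketch using the standard transformers image-classification API. The repo id below is an assumption taken from this card's title; prepend the owning namespace when loading from the Hub:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical repo id taken from the card title; add the owner's
# namespace (e.g. "owner/...") when loading from the Hugging Face Hub.
model_id = "resnet101-base_tobacco-cnn_tobacco3482_kd"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("document.png").convert("RGB")  # a scanned document image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```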

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching TrainingArguments sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 256
  • eval_batch_size: 256
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
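
For reference, the settings above map onto a TrainingArguments object roughly as follows. This is a hypothetical reconstruction; the actual training script is not included in this card:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above;
# the actual training script is not part of this card.
training_args = TrainingArguments(
    output_dir="resnet101-base_tobacco-cnn_tobacco3482_kd",
    learning_rate=1e-4,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    seed=42,
    adam_beta1=0.9,        # Adam betas/epsilon as stated above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```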

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|---------------|-------|------|-----------------|----------|------------|-----|----------|----------|-----|------|
| No log | 1.0 | 4 | 1.4988 | 0.065 | 0.9007 | 9.6504 | 0.065 | 0.0267 | 0.1512 | 0.9377 |
| No log | 2.0 | 8 | 1.4615 | 0.155 | 0.8961 | 7.9200 | 0.155 | 0.0268 | 0.2328 | 0.9605 |
| No log | 3.0 | 12 | 1.4913 | 0.155 | 0.9531 | 11.6402 | 0.155 | 0.0268 | 0.3390 | 0.8899 |
| No log | 4.0 | 16 | 2.2747 | 0.155 | 1.4111 | 11.0294 | 0.155 | 0.0268 | 0.7077 | 0.7068 |
| No log | 5.0 | 20 | 2.4543 | 0.155 | 1.4359 | 8.3074 | 0.155 | 0.0268 | 0.7226 | 0.6151 |
| No log | 6.0 | 24 | 1.9614 | 0.155 | 1.1785 | 6.6431 | 0.155 | 0.0283 | 0.5497 | 0.6022 |
| No log | 7.0 | 28 | 1.6280 | 0.18 | 0.9978 | 5.8468 | 0.18 | 0.0488 | 0.4014 | 0.6135 |
| No log | 8.0 | 32 | 1.3465 | 0.225 | 0.8993 | 5.6177 | 0.225 | 0.0740 | 0.3378 | 0.5786 |
| No log | 9.0 | 36 | 1.2597 | 0.225 | 0.8794 | 5.0542 | 0.225 | 0.0727 | 0.3403 | 0.5658 |
| No log | 10.0 | 40 | 1.1149 | 0.27 | 0.8181 | 4.5188 | 0.27 | 0.1222 | 0.2890 | 0.5230 |
| No log | 11.0 | 44 | 0.9805 | 0.31 | 0.7600 | 3.8687 | 0.31 | 0.1726 | 0.2703 | 0.4690 |
| No log | 12.0 | 48 | 1.0099 | 0.335 | 0.7732 | 3.6652 | 0.335 | 0.2095 | 0.2892 | 0.4739 |
| No log | 13.0 | 52 | 1.0522 | 0.335 | 0.7919 | 3.3843 | 0.335 | 0.2562 | 0.3006 | 0.6402 |
| No log | 14.0 | 56 | 1.0566 | 0.32 | 0.7868 | 3.4244 | 0.32 | 0.2373 | 0.3023 | 0.6094 |
| No log | 15.0 | 60 | 0.9670 | 0.405 | 0.7333 | 3.3926 | 0.405 | 0.3189 | 0.3013 | 0.4037 |
| No log | 16.0 | 64 | 1.0979 | 0.31 | 0.7877 | 3.3045 | 0.31 | 0.2262 | 0.2792 | 0.5720 |
| No log | 17.0 | 68 | 0.9022 | 0.44 | 0.6913 | 3.2277 | 0.44 | 0.3429 | 0.2902 | 0.3657 |
| No log | 18.0 | 72 | 1.2120 | 0.315 | 0.8075 | 4.1289 | 0.315 | 0.2323 | 0.2857 | 0.5909 |
| No log | 19.0 | 76 | 1.1945 | 0.39 | 0.7974 | 4.2350 | 0.39 | 0.3292 | 0.3271 | 0.5989 |
| No log | 20.0 | 80 | 1.3861 | 0.345 | 0.7981 | 5.2605 | 0.345 | 0.2700 | 0.2832 | 0.5299 |
| No log | 21.0 | 84 | 1.2243 | 0.33 | 0.8073 | 4.5262 | 0.33 | 0.2545 | 0.3068 | 0.6133 |
| No log | 22.0 | 88 | 1.0455 | 0.38 | 0.7238 | 2.7133 | 0.38 | 0.3084 | 0.2901 | 0.4855 |
| No log | 23.0 | 92 | 0.9044 | 0.45 | 0.6814 | 3.4361 | 0.45 | 0.3273 | 0.2927 | 0.3246 |
| No log | 24.0 | 96 | 0.8930 | 0.495 | 0.6596 | 3.3412 | 0.495 | 0.4185 | 0.2882 | 0.3070 |
| No log | 25.0 | 100 | 0.8665 | 0.485 | 0.6534 | 2.9998 | 0.485 | 0.4154 | 0.2641 | 0.3298 |
| No log | 26.0 | 104 | 1.0458 | 0.375 | 0.7579 | 3.1074 | 0.375 | 0.3333 | 0.2735 | 0.5293 |
| No log | 27.0 | 108 | 1.0170 | 0.41 | 0.7321 | 2.8884 | 0.41 | 0.3468 | 0.2976 | 0.4566 |
| No log | 28.0 | 112 | 1.0956 | 0.395 | 0.7464 | 3.3094 | 0.395 | 0.3255 | 0.3154 | 0.4684 |
| No log | 29.0 | 116 | 1.0805 | 0.39 | 0.7544 | 3.2115 | 0.39 | 0.3193 | 0.3014 | 0.4594 |
| No log | 30.0 | 120 | 1.2358 | 0.375 | 0.7733 | 4.3992 | 0.375 | 0.3058 | 0.2845 | 0.4876 |
| No log | 31.0 | 124 | 1.0532 | 0.4 | 0.7458 | 2.7398 | 0.4000 | 0.3614 | 0.2890 | 0.4961 |
| No log | 32.0 | 128 | 1.0166 | 0.365 | 0.7355 | 2.5093 | 0.3650 | 0.2862 | 0.2728 | 0.5057 |
| No log | 33.0 | 132 | 0.9395 | 0.48 | 0.6807 | 2.6211 | 0.48 | 0.4394 | 0.2843 | 0.3719 |
| No log | 34.0 | 136 | 0.8718 | 0.52 | 0.6538 | 2.6802 | 0.52 | 0.4697 | 0.2954 | 0.3051 |
| No log | 35.0 | 140 | 0.8339 | 0.51 | 0.6362 | 3.1084 | 0.51 | 0.4373 | 0.2654 | 0.3006 |
| No log | 36.0 | 144 | 0.8411 | 0.51 | 0.6359 | 2.7881 | 0.51 | 0.4286 | 0.2759 | 0.2906 |
| No log | 37.0 | 148 | 0.8556 | 0.505 | 0.6402 | 2.5519 | 0.505 | 0.4076 | 0.2522 | 0.3060 |
| No log | 38.0 | 152 | 1.0928 | 0.395 | 0.7438 | 2.8660 | 0.395 | 0.3337 | 0.2815 | 0.4724 |
| No log | 39.0 | 156 | 1.3830 | 0.39 | 0.8135 | 4.7392 | 0.39 | 0.3094 | 0.2879 | 0.5239 |
| No log | 40.0 | 160 | 1.2180 | 0.38 | 0.7760 | 3.8384 | 0.38 | 0.3106 | 0.2614 | 0.5109 |
| No log | 41.0 | 164 | 1.1337 | 0.365 | 0.7486 | 2.8843 | 0.3650 | 0.2948 | 0.2665 | 0.4630 |
| No log | 42.0 | 168 | 0.8814 | 0.53 | 0.6425 | 2.3353 | 0.53 | 0.4645 | 0.2968 | 0.2973 |
| No log | 43.0 | 172 | 0.8324 | 0.515 | 0.6174 | 2.4407 | 0.515 | 0.4517 | 0.2847 | 0.2742 |
| No log | 44.0 | 176 | 0.8477 | 0.53 | 0.6282 | 2.5469 | 0.53 | 0.4615 | 0.2712 | 0.2831 |
| No log | 45.0 | 180 | 0.8307 | 0.515 | 0.6190 | 2.4871 | 0.515 | 0.4404 | 0.2594 | 0.2845 |
| No log | 46.0 | 184 | 0.8116 | 0.53 | 0.6070 | 2.4944 | 0.53 | 0.4410 | 0.2337 | 0.2451 |
| No log | 47.0 | 188 | 0.8349 | 0.54 | 0.6260 | 2.2843 | 0.54 | 0.4423 | 0.2911 | 0.2616 |
| No log | 48.0 | 192 | 0.8298 | 0.555 | 0.6178 | 2.2946 | 0.555 | 0.4725 | 0.2568 | 0.2482 |
| No log | 49.0 | 196 | 0.8252 | 0.565 | 0.6141 | 2.3311 | 0.565 | 0.4762 | 0.2810 | 0.2504 |
| No log | 50.0 | 200 | 0.8158 | 0.565 | 0.6104 | 2.6027 | 0.565 | 0.4783 | 0.2677 | 0.2516 |
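
The Brier loss and ECE columns track calibration alongside accuracy. A sketch of how these two metrics are commonly computed from model logits (for illustration only; the exact evaluation code behind this card is not published):

```python
import torch
import torch.nn.functional as F

def brier_loss(logits: torch.Tensor, labels: torch.Tensor) -> float:
    # Mean squared distance between the softmax distribution and the
    # one-hot target, averaged over examples.
    probs = F.softmax(logits, dim=-1)
    onehot = F.one_hot(labels, num_classes=probs.shape[-1]).float()
    return ((probs - onehot) ** 2).sum(dim=-1).mean().item()

def expected_calibration_error(logits: torch.Tensor, labels: torch.Tensor,
                               n_bins: int = 10) -> float:
    # Weighted average |accuracy - confidence| over equal-width bins.
    probs = F.softmax(logits, dim=-1)
    conf, pred = probs.max(dim=-1)
    edges = torch.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            acc = (pred[mask] == labels[mask]).float().mean()
            gap = (acc - conf[mask].mean()).abs().item()
            ece += mask.float().mean().item() * gap
    return ece
```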

Framework versions

  • Transformers 4.36.0.dev0
  • PyTorch 2.2.0.dev20231112+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1