---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: plant-seedlings-model-ResNet18-freeze-0-12-20ep
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.9327111984282908
---
# plant-seedlings-model-ResNet18-freeze-0-12-20ep

This model is a fine-tuned version of [microsoft/resnet-18](https://huggingface.co/microsoft/resnet-18) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2060
- Accuracy: 0.9327
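
As a quick usage illustration, the snippet below runs the fine-tuned checkpoint through the `image-classification` pipeline. This is a minimal sketch: the model path is a placeholder (this card does not list a published repo id), and the input image filename is made up.

```python
# Minimal inference sketch. The model path below is a placeholder for a local
# checkpoint directory or your own Hub repo id; it is not a published model.
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="path/to/plant-seedlings-model-ResNet18-freeze-0-12-20ep",
)

image = Image.open("seedling.jpg")  # placeholder input image
predictions = classifier(image)
print(predictions)  # list of {"label": ..., "score": ...} dicts
```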
## Model description

More information needed
## Intended uses & limitations

More information needed
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
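
The hyperparameters above map onto `TrainingArguments` roughly as follows. This is a reproduction sketch rather than the original training script: the output directory is assumed from the model name, dataset preparation and the layer freezing implied by "freeze-0-12" are not shown, and Adam's betas/epsilon are the Trainer defaults.

```python
# Rough reproduction sketch of the listed hyperparameters with the Trainer API.
# Only the argument values come from this card; the output_dir name is assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="plant-seedlings-model-ResNet18-freeze-0-12-20ep",  # assumed
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=20,
    lr_scheduler_type="linear",
    fp16=True,  # "Native AMP" mixed-precision training
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer defaults
# (adam_beta1/adam_beta2/adam_epsilon), so no explicit override is needed.
```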
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
0.4528 | 0.25 | 128 | 0.6686 | 0.7878 |
0.4291 | 0.5 | 256 | 0.6259 | 0.7903 |
0.4961 | 0.75 | 384 | 0.5677 | 0.8055 |
0.4637 | 1.01 | 512 | 0.5073 | 0.8330 |
0.6897 | 1.26 | 640 | 0.5817 | 0.8060 |
0.5257 | 1.51 | 768 | 0.5118 | 0.8276 |
0.5381 | 1.76 | 896 | 0.4809 | 0.8384 |
0.4736 | 2.01 | 1024 | 0.3976 | 0.8595 |
0.4967 | 2.26 | 1152 | 0.4192 | 0.8566 |
0.4505 | 2.51 | 1280 | 0.4128 | 0.8590 |
0.4211 | 2.77 | 1408 | 0.4075 | 0.8576 |
0.3877 | 3.02 | 1536 | 0.3796 | 0.8738 |
0.3134 | 3.27 | 1664 | 0.3906 | 0.8762 |
0.3596 | 3.52 | 1792 | 0.3703 | 0.8846 |
0.3859 | 3.77 | 1920 | 0.3125 | 0.8954 |
0.4076 | 4.02 | 2048 | 0.3718 | 0.8615 |
0.3109 | 4.28 | 2176 | 0.3449 | 0.8924 |
0.4588 | 4.53 | 2304 | 0.3377 | 0.8875 |
0.2923 | 4.78 | 2432 | 0.3001 | 0.8998 |
0.3273 | 5.03 | 2560 | 0.3187 | 0.8880 |
0.2541 | 5.28 | 2688 | 0.3432 | 0.8856 |
0.3059 | 5.53 | 2816 | 0.3236 | 0.8988 |
0.2979 | 5.78 | 2944 | 0.3532 | 0.8851 |
0.2748 | 6.04 | 3072 | 0.3407 | 0.8885 |
0.3537 | 6.29 | 3200 | 0.2925 | 0.8988 |
0.3364 | 6.54 | 3328 | 0.3071 | 0.9047 |
0.2135 | 6.79 | 3456 | 0.2765 | 0.9077 |
0.2023 | 7.04 | 3584 | 0.2919 | 0.9037 |
0.1977 | 7.29 | 3712 | 0.2812 | 0.8978 |
0.4042 | 7.54 | 3840 | 0.2954 | 0.8998 |
0.3662 | 7.8 | 3968 | 0.2857 | 0.9018 |
0.1872 | 8.05 | 4096 | 0.2504 | 0.9140 |
0.3959 | 8.3 | 4224 | 0.2984 | 0.8993 |
0.2403 | 8.55 | 4352 | 0.2847 | 0.8998 |
0.3689 | 8.8 | 4480 | 0.2872 | 0.9023 |
0.2819 | 9.05 | 4608 | 0.3104 | 0.9008 |
0.1926 | 9.3 | 4736 | 0.2871 | 0.8969 |
0.2371 | 9.56 | 4864 | 0.2733 | 0.9082 |
0.2566 | 9.81 | 4992 | 0.2816 | 0.9101 |
0.2174 | 10.06 | 5120 | 0.2719 | 0.9160 |
0.2359 | 10.31 | 5248 | 0.2497 | 0.9175 |
0.2986 | 10.56 | 5376 | 0.2847 | 0.9096 |
0.2239 | 10.81 | 5504 | 0.2493 | 0.9180 |
0.2132 | 11.06 | 5632 | 0.2567 | 0.9121 |
0.1934 | 11.32 | 5760 | 0.2722 | 0.9028 |
0.2026 | 11.57 | 5888 | 0.2456 | 0.9229 |
0.2457 | 11.82 | 6016 | 0.2483 | 0.9234 |
0.2537 | 12.07 | 6144 | 0.2409 | 0.9165 |
0.193 | 12.32 | 6272 | 0.2215 | 0.9239 |
0.1738 | 12.57 | 6400 | 0.2421 | 0.9165 |
0.2925 | 12.83 | 6528 | 0.2499 | 0.9150 |
0.1173 | 13.08 | 6656 | 0.2174 | 0.9258 |
0.2147 | 13.33 | 6784 | 0.2917 | 0.9131 |
0.1581 | 13.58 | 6912 | 0.2734 | 0.9175 |
0.1349 | 13.83 | 7040 | 0.2485 | 0.9165 |
0.1212 | 14.08 | 7168 | 0.2247 | 0.9268 |
0.2178 | 14.33 | 7296 | 0.2289 | 0.9268 |
0.0879 | 14.59 | 7424 | 0.2512 | 0.9219 |
0.2006 | 14.84 | 7552 | 0.2321 | 0.9293 |
0.2308 | 15.09 | 7680 | 0.2491 | 0.9263 |
0.2137 | 15.34 | 7808 | 0.2270 | 0.9312 |
0.1112 | 15.59 | 7936 | 0.2205 | 0.9249 |
0.1477 | 15.84 | 8064 | 0.2328 | 0.9307 |
0.1794 | 16.09 | 8192 | 0.2051 | 0.9332 |
0.0596 | 16.35 | 8320 | 0.2234 | 0.9347 |
0.0533 | 16.6 | 8448 | 0.2469 | 0.9293 |
0.1096 | 16.85 | 8576 | 0.1871 | 0.9401 |
0.1117 | 17.1 | 8704 | 0.2302 | 0.9249 |
0.1349 | 17.35 | 8832 | 0.2084 | 0.9391 |
0.1031 | 17.6 | 8960 | 0.2200 | 0.9283 |
0.2428 | 17.85 | 9088 | 0.2201 | 0.9298 |
0.1283 | 18.11 | 9216 | 0.2293 | 0.9273 |
0.1688 | 18.36 | 9344 | 0.2120 | 0.9307 |
0.0877 | 18.61 | 9472 | 0.2200 | 0.9229 |
0.1508 | 18.86 | 9600 | 0.2204 | 0.9327 |
0.0868 | 19.11 | 9728 | 0.2224 | 0.9293 |
0.211 | 19.36 | 9856 | 0.1988 | 0.9401 |
0.1059 | 19.61 | 9984 | 0.2082 | 0.9322 |
0.182 | 19.87 | 10112 | 0.2060 | 0.9327 |
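
The Accuracy column above is the standard accuracy metric over the evaluation split. A typical `compute_metrics` hook that produces it with the `Trainer` is sketched below; the original training script is not included in this card, so treat this as an assumption about how the metric was wired up.

```python
# Sketch of a compute_metrics hook reporting accuracy, as commonly used with
# Trainer-based image classification fine-tuning. Assumed, not from this card.
import evaluate
import numpy as np

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```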
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.3