---
license: apache-2.0
base_model: facebook/convnextv2-tiny-1k-224
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: convnextv2-tiny-1k-224-finetuned-eurosat
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.8288288288288288
---
# convnextv2-tiny-1k-224-finetuned-eurosat

This model is a fine-tuned version of [facebook/convnextv2-tiny-1k-224](https://huggingface.co/facebook/convnextv2-tiny-1k-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7923
- Accuracy: 0.8288
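
A minimal inference sketch. The model id and image path below are hypothetical placeholders; substitute the actual Hub repo id (or a local checkpoint directory) and a real image:

```python
from transformers import pipeline

# Hypothetical model id; replace with the real Hub repo or local path.
classifier = pipeline(
    "image-classification",
    model="convnextv2-tiny-1k-224-finetuned-eurosat",
)

# Returns a list of {"label": ..., "score": ...} dicts, best score first.
preds = classifier("path/to/image.png")  # hypothetical image path
print(preds[0]["label"], preds[0]["score"])
```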

## Model description

The base checkpoint, [facebook/convnextv2-tiny-1k-224](https://huggingface.co/facebook/convnextv2-tiny-1k-224), is a ConvNeXt V2 Tiny convolutional backbone trained on ImageNet-1k at 224×224 resolution. This model fine-tunes that backbone for a new image-classification task; judging by the repository name, the target data is EuroSAT-style satellite imagery.
## Intended uses & limitations

The model is intended for image classification on data resembling its fine-tuning set (apparently EuroSAT-style satellite tiles, per the model name). As with any narrowly fine-tuned classifier, accuracy will likely degrade on images from other domains, resolutions, or sensors, and the reported metric comes from a single evaluation split.
## Training and evaluation data

The model was trained on a local dataset loaded with the Hugging Face Datasets `imagefolder` builder, which infers class labels from subdirectory names. The exact data source and train/validation split sizes are not recorded in this card; the model-index metadata above lists the accuracy under the default config's `train` split.
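
For reference, an `imagefolder` dataset of this kind is typically loaded as below. The data directory is a hypothetical placeholder; the path actually used for this run is not recorded in the card:

```python
from datasets import load_dataset

# Hypothetical layout: one subdirectory per class under data_dir.
dataset = load_dataset("imagefolder", data_dir="path/to/data")

# Class names are inferred from the folder names.
print(dataset["train"].features["label"].names)
```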
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 150
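
As a quick sanity check, the derived quantities follow directly from these settings. Using the 4650 optimizer steps logged in the results table, the linear warmup covers roughly the first 465 steps:

```python
# Effective (total) train batch size = per-device batch size x accumulation steps.
train_batch_size = 32
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 128, matching the value listed above

# Warmup length under lr_scheduler_warmup_ratio = 0.1, given the 4650
# optimizer steps recorded in the results table.
total_steps = 4650
warmup_steps = int(0.1 * total_steps)
print(warmup_steps)  # 465
```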

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 3.5919        | 0.992   | 31   | 3.5736          | 0.0315   |
| 3.5392        | 1.984   | 62   | 3.4990          | 0.0473   |
| 3.4591        | 2.976   | 93   | 3.3992          | 0.1104   |
| 3.3361        | 4.0     | 125  | 3.2526          | 0.2523   |
| 3.2066        | 4.992   | 156  | 3.0851          | 0.3761   |
| 3.0024        | 5.984   | 187  | 2.8376          | 0.4437   |
| 2.8094        | 6.976   | 218  | 2.6320          | 0.4910   |
| 2.509         | 8.0     | 250  | 2.3765          | 0.5360   |
| 2.2526        | 8.992   | 281  | 2.0400          | 0.5923   |
| 1.9442        | 9.984   | 312  | 1.7940          | 0.6396   |
| 1.7672        | 10.9760 | 343  | 1.5892          | 0.6824   |
| 1.5273        | 12.0    | 375  | 1.3500          | 0.7185   |
| 1.3854        | 12.992  | 406  | 1.2243          | 0.7162   |
| 1.197         | 13.984  | 437  | 1.1022          | 0.7387   |
| 1.1114        | 14.9760 | 468  | 1.0138          | 0.7613   |
| 0.9364        | 16.0    | 500  | 0.9164          | 0.7748   |
| 0.8755        | 16.992  | 531  | 0.9058          | 0.7523   |
| 0.7473        | 17.984  | 562  | 0.8045          | 0.7928   |
| 0.7189        | 18.976  | 593  | 0.7735          | 0.7883   |
| 0.6461        | 20.0    | 625  | 0.6876          | 0.8198   |
| 0.6041        | 20.992  | 656  | 0.7212          | 0.7973   |
| 0.5016        | 21.984  | 687  | 0.6611          | 0.8198   |
| 0.4996        | 22.976  | 718  | 0.6110          | 0.8153   |
| 0.4825        | 24.0    | 750  | 0.6476          | 0.8063   |
| 0.434         | 24.992  | 781  | 0.6793          | 0.8041   |
| 0.4296        | 25.984  | 812  | 0.6015          | 0.8018   |
| 0.36          | 26.976  | 843  | 0.6615          | 0.8063   |
| 0.3646        | 28.0    | 875  | 0.6059          | 0.8221   |
| 0.3542        | 28.992  | 906  | 0.6973          | 0.7928   |
| 0.3091        | 29.984  | 937  | 0.6400          | 0.8266   |
| 0.2774        | 30.976  | 968  | 0.5798          | 0.8266   |
| 0.3166        | 32.0    | 1000 | 0.6134          | 0.8333   |
| 0.2878        | 32.992  | 1031 | 0.6353          | 0.8063   |
| 0.2529        | 33.984  | 1062 | 0.6628          | 0.8243   |
| 0.2601        | 34.976  | 1093 | 0.6367          | 0.8041   |
| 0.2208        | 36.0    | 1125 | 0.6313          | 0.8288   |
| 0.2342        | 36.992  | 1156 | 0.5969          | 0.8378   |
| 0.2122        | 37.984  | 1187 | 0.6391          | 0.8198   |
| 0.1791        | 38.976  | 1218 | 0.6771          | 0.8108   |
| 0.2113        | 40.0    | 1250 | 0.7035          | 0.8086   |
| 0.1703        | 40.992  | 1281 | 0.7096          | 0.8153   |
| 0.1751        | 41.984  | 1312 | 0.5964          | 0.8446   |
| 0.1889        | 42.976  | 1343 | 0.6607          | 0.8446   |
| 0.1791        | 44.0    | 1375 | 0.7000          | 0.8243   |
| 0.1372        | 44.992  | 1406 | 0.6866          | 0.8243   |
| 0.1785        | 45.984  | 1437 | 0.6621          | 0.8266   |
| 0.1469        | 46.976  | 1468 | 0.6391          | 0.8266   |
| 0.1628        | 48.0    | 1500 | 0.6623          | 0.8356   |
| 0.1425        | 48.992  | 1531 | 0.6443          | 0.8288   |
| 0.1727        | 49.984  | 1562 | 0.6361          | 0.8446   |
| 0.1442        | 50.976  | 1593 | 0.6397          | 0.8491   |
| 0.1386        | 52.0    | 1625 | 0.6835          | 0.8423   |
| 0.1564        | 52.992  | 1656 | 0.7072          | 0.8266   |
| 0.1151        | 53.984  | 1687 | 0.6835          | 0.8311   |
| 0.1446        | 54.976  | 1718 | 0.7347          | 0.8198   |
| 0.1353        | 56.0    | 1750 | 0.6935          | 0.8401   |
| 0.13          | 56.992  | 1781 | 0.7337          | 0.8198   |
| 0.1312        | 57.984  | 1812 | 0.6625          | 0.8311   |
| 0.1201        | 58.976  | 1843 | 0.6956          | 0.8243   |
| 0.1411        | 60.0    | 1875 | 0.7290          | 0.8243   |
| 0.1116        | 60.992  | 1906 | 0.7052          | 0.8356   |
| 0.1251        | 61.984  | 1937 | 0.6915          | 0.8311   |
| 0.1101        | 62.976  | 1968 | 0.6457          | 0.8378   |
| 0.0883        | 64.0    | 2000 | 0.6553          | 0.8378   |
| 0.1225        | 64.992  | 2031 | 0.6454          | 0.8401   |
| 0.1135        | 65.984  | 2062 | 0.6616          | 0.8514   |
| 0.1009        | 66.976  | 2093 | 0.6375          | 0.8536   |
| 0.1027        | 68.0    | 2125 | 0.6754          | 0.8266   |
| 0.0925        | 68.992  | 2156 | 0.7497          | 0.8176   |
| 0.0878        | 69.984  | 2187 | 0.6573          | 0.8491   |
| 0.1093        | 70.976  | 2218 | 0.7015          | 0.8356   |
| 0.1024        | 72.0    | 2250 | 0.6907          | 0.8446   |
| 0.0934        | 72.992  | 2281 | 0.7059          | 0.8356   |
| 0.103         | 73.984  | 2312 | 0.7159          | 0.8356   |
| 0.0974        | 74.976  | 2343 | 0.7324          | 0.8266   |
| 0.1049        | 76.0    | 2375 | 0.7397          | 0.8311   |
| 0.097         | 76.992  | 2406 | 0.7529          | 0.8176   |
| 0.0816        | 77.984  | 2437 | 0.7175          | 0.8423   |
| 0.0902        | 78.976  | 2468 | 0.7745          | 0.8288   |
| 0.0827        | 80.0    | 2500 | 0.7017          | 0.8423   |
| 0.0818        | 80.992  | 2531 | 0.7712          | 0.8243   |
| 0.076         | 81.984  | 2562 | 0.7341          | 0.8423   |
| 0.0837        | 82.976  | 2593 | 0.7242          | 0.8491   |
| 0.0743        | 84.0    | 2625 | 0.6999          | 0.8446   |
| 0.0552        | 84.992  | 2656 | 0.6875          | 0.8401   |
| 0.0762        | 85.984  | 2687 | 0.6743          | 0.8581   |
| 0.0742        | 86.976  | 2718 | 0.7027          | 0.8446   |
| 0.0708        | 88.0    | 2750 | 0.7367          | 0.8356   |
| 0.086         | 88.992  | 2781 | 0.6905          | 0.8401   |
| 0.0575        | 89.984  | 2812 | 0.7041          | 0.8423   |
| 0.0733        | 90.976  | 2843 | 0.6465          | 0.8423   |
| 0.0701        | 92.0    | 2875 | 0.7066          | 0.8401   |
| 0.0782        | 92.992  | 2906 | 0.6955          | 0.8243   |
| 0.0754        | 93.984  | 2937 | 0.6836          | 0.8468   |
| 0.0545        | 94.976  | 2968 | 0.7290          | 0.8288   |
| 0.0913        | 96.0    | 3000 | 0.7665          | 0.8266   |
| 0.0816        | 96.992  | 3031 | 0.7661          | 0.8311   |
| 0.0696        | 97.984  | 3062 | 0.6921          | 0.8356   |
| 0.0627        | 98.976  | 3093 | 0.7070          | 0.8446   |
| 0.0562        | 100.0   | 3125 | 0.7442          | 0.8401   |
| 0.0742        | 100.992 | 3156 | 0.7000          | 0.8423   |
| 0.0545        | 101.984 | 3187 | 0.7312          | 0.8401   |
| 0.0635        | 102.976 | 3218 | 0.7231          | 0.8491   |
| 0.0608        | 104.0   | 3250 | 0.7332          | 0.8333   |
| 0.0769        | 104.992 | 3281 | 0.7328          | 0.8356   |
| 0.057         | 105.984 | 3312 | 0.6954          | 0.8378   |
| 0.0447        | 106.976 | 3343 | 0.7006          | 0.8423   |
| 0.0629        | 108.0   | 3375 | 0.7149          | 0.8423   |
| 0.0394        | 108.992 | 3406 | 0.7469          | 0.8378   |
| 0.0602        | 109.984 | 3437 | 0.7274          | 0.8468   |
| 0.0635        | 110.976 | 3468 | 0.7495          | 0.8446   |
| 0.0565        | 112.0   | 3500 | 0.7885          | 0.8401   |
| 0.035         | 112.992 | 3531 | 0.7178          | 0.8468   |
| 0.0604        | 113.984 | 3562 | 0.7574          | 0.8356   |
| 0.0507        | 114.976 | 3593 | 0.7901          | 0.8266   |
| 0.05          | 116.0   | 3625 | 0.7730          | 0.8198   |
| 0.0465        | 116.992 | 3656 | 0.7967          | 0.8401   |
| 0.042         | 117.984 | 3687 | 0.7767          | 0.8423   |
| 0.0609        | 118.976 | 3718 | 0.7872          | 0.8378   |
| 0.0379        | 120.0   | 3750 | 0.7685          | 0.8514   |
| 0.0579        | 120.992 | 3781 | 0.7709          | 0.8423   |
| 0.0471        | 121.984 | 3812 | 0.7601          | 0.8423   |
| 0.0488        | 122.976 | 3843 | 0.8231          | 0.8356   |
| 0.0531        | 124.0   | 3875 | 0.8016          | 0.8378   |
| 0.0446        | 124.992 | 3906 | 0.7806          | 0.8423   |
| 0.0479        | 125.984 | 3937 | 0.7668          | 0.8378   |
| 0.0525        | 126.976 | 3968 | 0.7874          | 0.8288   |
| 0.0512        | 128.0   | 4000 | 0.7652          | 0.8311   |
| 0.0473        | 128.992 | 4031 | 0.7721          | 0.8356   |
| 0.0579        | 129.984 | 4062 | 0.7607          | 0.8356   |
| 0.0444        | 130.976 | 4093 | 0.7917          | 0.8356   |
| 0.0462        | 132.0   | 4125 | 0.7877          | 0.8333   |
| 0.0483        | 132.992 | 4156 | 0.8122          | 0.8401   |
| 0.042         | 133.984 | 4187 | 0.7956          | 0.8378   |
| 0.0439        | 134.976 | 4218 | 0.8281          | 0.8311   |
| 0.0458        | 136.0   | 4250 | 0.7723          | 0.8446   |
| 0.0307        | 136.992 | 4281 | 0.7686          | 0.8446   |
| 0.0481        | 137.984 | 4312 | 0.7834          | 0.8378   |
| 0.0503        | 138.976 | 4343 | 0.7987          | 0.8378   |
| 0.038         | 140.0   | 4375 | 0.8156          | 0.8311   |
| 0.0472        | 140.992 | 4406 | 0.8030          | 0.8356   |
| 0.0282        | 141.984 | 4437 | 0.7884          | 0.8378   |
| 0.0541        | 142.976 | 4468 | 0.7969          | 0.8311   |
| 0.0415        | 144.0   | 4500 | 0.7899          | 0.8333   |
| 0.0579        | 144.992 | 4531 | 0.7979          | 0.8266   |
| 0.048         | 145.984 | 4562 | 0.7935          | 0.8288   |
| 0.0353        | 146.976 | 4593 | 0.7933          | 0.8288   |
| 0.0438        | 148.0   | 4625 | 0.7916          | 0.8288   |
| 0.0487        | 148.8   | 4650 | 0.7923          | 0.8288   |

### Framework versions

- Transformers 4.44.0
- Pytorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1