# ebayes/amazonas-fern-latest
This model is a fine-tuned version of google/vit-base-patch16-224-in21k on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 1.2619
- Accuracy: 0.7969
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 10
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 150
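The per-epoch step counts in the results table follow directly from these hyperparameters: 516 optimizer steps per epoch at a train batch size of 10 bounds the size of the training split. A quick sanity check (the dataset-size range is an inference from the step count, not a figure reported in the card):

```python
# Values taken from the hyperparameters and results table of this card
train_batch_size = 10
steps_per_epoch = 516      # step count at epoch 1.0 in the results table
num_epochs = 150

# Total optimizer steps over the full run
total_steps = steps_per_epoch * num_epochs
print(total_steps)  # 77400, matching the final row of the results table

# Assuming the last (possibly partial) batch is kept,
# ceil(n / 10) == 516 implies 5151 <= n <= 5160 training images.
n_min = (steps_per_epoch - 1) * train_batch_size + 1
n_max = steps_per_epoch * train_batch_size
print(n_min, n_max)  # 5151 5160
```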
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
4.9425 | 1.0 | 516 | 4.6442 | 0.2450 |
4.4394 | 2.0 | 1032 | 4.1936 | 0.3271 |
4.0242 | 3.0 | 1548 | 3.8147 | 0.3891 |
3.6368 | 4.0 | 2064 | 3.4881 | 0.4403 |
3.3168 | 5.0 | 2580 | 3.1849 | 0.4760 |
2.9583 | 6.0 | 3096 | 2.9087 | 0.5054 |
2.6652 | 7.0 | 3612 | 2.6435 | 0.5271 |
2.3696 | 8.0 | 4128 | 2.4352 | 0.5442 |
2.1322 | 9.0 | 4644 | 2.2335 | 0.5814 |
1.8776 | 10.0 | 5160 | 2.0674 | 0.5922 |
1.6773 | 11.0 | 5676 | 1.9474 | 0.6093 |
1.5136 | 12.0 | 6192 | 1.8081 | 0.6264 |
1.3341 | 13.0 | 6708 | 1.6931 | 0.6419 |
1.2215 | 14.0 | 7224 | 1.5986 | 0.6481 |
1.0886 | 15.0 | 7740 | 1.5309 | 0.6744 |
0.9762 | 16.0 | 8256 | 1.4605 | 0.6760 |
0.8322 | 17.0 | 8772 | 1.4038 | 0.6946 |
0.7767 | 18.0 | 9288 | 1.3404 | 0.6961 |
0.6943 | 19.0 | 9804 | 1.3143 | 0.7085 |
0.6011 | 20.0 | 10320 | 1.2708 | 0.7256 |
0.5585 | 21.0 | 10836 | 1.2777 | 0.7101 |
0.5014 | 22.0 | 11352 | 1.2744 | 0.7147 |
0.4704 | 23.0 | 11868 | 1.1907 | 0.7302 |
0.3934 | 24.0 | 12384 | 1.1748 | 0.7442 |
0.3616 | 25.0 | 12900 | 1.1897 | 0.7364 |
0.3274 | 26.0 | 13416 | 1.1648 | 0.7426 |
0.3062 | 27.0 | 13932 | 1.1899 | 0.7333 |
0.2726 | 28.0 | 14448 | 1.1192 | 0.7488 |
0.2425 | 29.0 | 14964 | 1.0887 | 0.7643 |
0.2316 | 30.0 | 15480 | 1.0957 | 0.7674 |
0.2321 | 31.0 | 15996 | 1.1206 | 0.7504 |
0.1828 | 32.0 | 16512 | 1.1901 | 0.7426 |
0.1675 | 33.0 | 17028 | 1.1317 | 0.7566 |
0.1572 | 34.0 | 17544 | 1.1530 | 0.7380 |
0.1453 | 35.0 | 18060 | 1.1519 | 0.7550 |
0.1385 | 36.0 | 18576 | 1.1358 | 0.7690 |
0.138 | 37.0 | 19092 | 1.1481 | 0.7628 |
0.1244 | 38.0 | 19608 | 1.1959 | 0.7442 |
0.1376 | 39.0 | 20124 | 1.1581 | 0.7659 |
0.107 | 40.0 | 20640 | 1.1979 | 0.7628 |
0.1219 | 41.0 | 21156 | 1.1915 | 0.7566 |
0.1105 | 42.0 | 21672 | 1.2247 | 0.7550 |
0.127 | 43.0 | 22188 | 1.1439 | 0.7736 |
0.1022 | 44.0 | 22704 | 1.1729 | 0.7535 |
0.1158 | 45.0 | 23220 | 1.2010 | 0.7535 |
0.1045 | 46.0 | 23736 | 1.2051 | 0.7519 |
0.103 | 47.0 | 24252 | 1.2006 | 0.7643 |
0.0967 | 48.0 | 24768 | 1.1888 | 0.7581 |
0.0963 | 49.0 | 25284 | 1.1814 | 0.7690 |
0.0923 | 50.0 | 25800 | 1.1566 | 0.7705 |
0.1071 | 51.0 | 26316 | 1.2239 | 0.7566 |
0.081 | 52.0 | 26832 | 1.2263 | 0.7581 |
0.0922 | 53.0 | 27348 | 1.1442 | 0.7628 |
0.0787 | 54.0 | 27864 | 1.2122 | 0.7705 |
0.0952 | 55.0 | 28380 | 1.3165 | 0.7504 |
0.1057 | 56.0 | 28896 | 1.2726 | 0.7550 |
0.1123 | 57.0 | 29412 | 1.2554 | 0.7597 |
0.0703 | 58.0 | 29928 | 1.1242 | 0.7752 |
0.094 | 59.0 | 30444 | 1.1734 | 0.7767 |
0.0699 | 60.0 | 30960 | 1.2493 | 0.7550 |
0.0731 | 61.0 | 31476 | 1.2414 | 0.7643 |
0.0888 | 62.0 | 31992 | 1.3430 | 0.7473 |
0.0737 | 63.0 | 32508 | 1.3174 | 0.7566 |
0.0825 | 64.0 | 33024 | 1.3129 | 0.7597 |
0.0821 | 65.0 | 33540 | 1.2509 | 0.7736 |
0.0817 | 66.0 | 34056 | 1.2020 | 0.7736 |
0.0754 | 67.0 | 34572 | 1.2447 | 0.7721 |
0.0854 | 68.0 | 35088 | 1.2626 | 0.7767 |
0.0755 | 69.0 | 35604 | 1.2202 | 0.7814 |
0.0847 | 70.0 | 36120 | 1.2525 | 0.7612 |
0.068 | 71.0 | 36636 | 1.2940 | 0.7674 |
0.0648 | 72.0 | 37152 | 1.2585 | 0.7736 |
0.0768 | 73.0 | 37668 | 1.2878 | 0.7597 |
0.0771 | 74.0 | 38184 | 1.2685 | 0.7659 |
0.0749 | 75.0 | 38700 | 1.2860 | 0.7721 |
0.0615 | 76.0 | 39216 | 1.3085 | 0.7643 |
0.0677 | 77.0 | 39732 | 1.3011 | 0.7674 |
0.0673 | 78.0 | 40248 | 1.2077 | 0.7814 |
0.0696 | 79.0 | 40764 | 1.2118 | 0.7860 |
0.0714 | 80.0 | 41280 | 1.1952 | 0.7767 |
0.0624 | 81.0 | 41796 | 1.2575 | 0.7690 |
0.0604 | 82.0 | 42312 | 1.2816 | 0.7736 |
0.0641 | 83.0 | 42828 | 1.3230 | 0.7643 |
0.0574 | 84.0 | 43344 | 1.2876 | 0.7752 |
0.0621 | 85.0 | 43860 | 1.2576 | 0.7845 |
0.0639 | 86.0 | 44376 | 1.2486 | 0.7705 |
0.0538 | 87.0 | 44892 | 1.2192 | 0.7845 |
0.0518 | 88.0 | 45408 | 1.2171 | 0.7674 |
0.0563 | 89.0 | 45924 | 1.3201 | 0.7581 |
0.0531 | 90.0 | 46440 | 1.2414 | 0.7736 |
0.0431 | 91.0 | 46956 | 1.3059 | 0.7736 |
0.0655 | 92.0 | 47472 | 1.3307 | 0.7566 |
0.0595 | 93.0 | 47988 | 1.2927 | 0.7659 |
0.0707 | 94.0 | 48504 | 1.2667 | 0.7628 |
0.0517 | 95.0 | 49020 | 1.2957 | 0.7597 |
0.0579 | 96.0 | 49536 | 1.3340 | 0.7643 |
0.0492 | 97.0 | 50052 | 1.3588 | 0.7535 |
0.0472 | 98.0 | 50568 | 1.3074 | 0.7612 |
0.0542 | 99.0 | 51084 | 1.2657 | 0.7705 |
0.0689 | 100.0 | 51600 | 1.2943 | 0.7752 |
0.0464 | 101.0 | 52116 | 1.2386 | 0.7953 |
0.0589 | 102.0 | 52632 | 1.2717 | 0.7767 |
0.0488 | 103.0 | 53148 | 1.2678 | 0.7814 |
0.0554 | 104.0 | 53664 | 1.2711 | 0.7783 |
0.0502 | 105.0 | 54180 | 1.2746 | 0.7721 |
0.0383 | 106.0 | 54696 | 1.3002 | 0.7798 |
0.0531 | 107.0 | 55212 | 1.2636 | 0.7891 |
0.0379 | 108.0 | 55728 | 1.3156 | 0.7721 |
0.042 | 109.0 | 56244 | 1.3668 | 0.7674 |
0.0543 | 110.0 | 56760 | 1.2883 | 0.7783 |
0.0522 | 111.0 | 57276 | 1.2913 | 0.7783 |
0.0469 | 112.0 | 57792 | 1.2847 | 0.7767 |
0.0598 | 113.0 | 58308 | 1.2642 | 0.7876 |
0.0472 | 114.0 | 58824 | 1.3264 | 0.7752 |
0.0405 | 115.0 | 59340 | 1.2648 | 0.7891 |
0.0434 | 116.0 | 59856 | 1.3059 | 0.7798 |
0.0481 | 117.0 | 60372 | 1.3373 | 0.7736 |
0.0454 | 118.0 | 60888 | 1.3237 | 0.7736 |
0.0504 | 119.0 | 61404 | 1.2956 | 0.7736 |
0.0495 | 120.0 | 61920 | 1.3504 | 0.7705 |
0.0424 | 121.0 | 62436 | 1.2852 | 0.7891 |
0.0493 | 122.0 | 62952 | 1.2621 | 0.7891 |
0.0421 | 123.0 | 63468 | 1.2755 | 0.7752 |
0.0339 | 124.0 | 63984 | 1.2914 | 0.7891 |
0.0415 | 125.0 | 64500 | 1.2959 | 0.7876 |
0.035 | 126.0 | 65016 | 1.2724 | 0.7891 |
0.0342 | 127.0 | 65532 | 1.2564 | 0.7798 |
0.0411 | 128.0 | 66048 | 1.2493 | 0.7798 |
0.0345 | 129.0 | 66564 | 1.2490 | 0.7891 |
0.0365 | 130.0 | 67080 | 1.2560 | 0.7969 |
0.0304 | 131.0 | 67596 | 1.2466 | 0.7876 |
0.0361 | 132.0 | 68112 | 1.2691 | 0.7953 |
0.0387 | 133.0 | 68628 | 1.2849 | 0.7860 |
0.0361 | 134.0 | 69144 | 1.2731 | 0.7891 |
0.0334 | 135.0 | 69660 | 1.2649 | 0.7907 |
0.0368 | 136.0 | 70176 | 1.2562 | 0.7953 |
0.0395 | 137.0 | 70692 | 1.2851 | 0.7891 |
0.0397 | 138.0 | 71208 | 1.2767 | 0.7891 |
0.0433 | 139.0 | 71724 | 1.2383 | 0.8031 |
0.031 | 140.0 | 72240 | 1.2429 | 0.7984 |
0.0326 | 141.0 | 72756 | 1.2389 | 0.8047 |
0.0369 | 142.0 | 73272 | 1.2475 | 0.8000 |
0.0436 | 143.0 | 73788 | 1.2762 | 0.7907 |
0.031 | 144.0 | 74304 | 1.2772 | 0.7891 |
0.0278 | 145.0 | 74820 | 1.2513 | 0.7984 |
0.0345 | 146.0 | 75336 | 1.2639 | 0.7969 |
0.034 | 147.0 | 75852 | 1.2679 | 0.7953 |
0.0331 | 148.0 | 76368 | 1.2682 | 0.7938 |
0.028 | 149.0 | 76884 | 1.2634 | 0.7953 |
0.0356 | 150.0 | 77400 | 1.2619 | 0.7969 |
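Validation loss bottoms out at 1.0887 around epoch 29 while accuracy keeps drifting upward, peaking at 0.8047 at epoch 141 before settling at the final 0.7969. When selecting a checkpoint from a log like this, the best epoch depends on which criterion you optimize; a minimal sketch over a handful of rows copied from the table above (the `rows` list is truncated for illustration):

```python
# A few (epoch, validation_loss, accuracy) rows from the results table above
rows = [
    (29, 1.0887, 0.7643),
    (101, 1.2386, 0.7953),
    (141, 1.2389, 0.8047),
    (150, 1.2619, 0.7969),
]

# Pick the best checkpoint by either criterion
best_by_loss = min(rows, key=lambda r: r[1])
best_by_acc = max(rows, key=lambda r: r[2])
print(best_by_loss[0])  # 29  (lowest validation loss)
print(best_by_acc[0])   # 141 (highest accuracy)
```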
### Framework versions
- Transformers 4.40.1
- Pytorch 2.2.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1