# segformer-finetuned-segments-riceleafdisease-dec-18
This model is a fine-tuned version of nvidia/mit-b0 on the nancyalarabawy/RiceLeafDiseases dataset. It achieves the following results on the evaluation set:
- Loss: 0.0654
- Mean Iou: 0.8022
- Mean Accuracy: 0.8495
- Overall Accuracy: 0.9793
- Accuracy Unlabelled: nan
- Accuracy Healthy: 0.9379
- Accuracy Disease: 0.6144
- Accuracy Background: 0.9962
- Iou Unlabelled: nan
- Iou Healthy: 0.8897
- Iou Disease: 0.5318
- Iou Background: 0.9851
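The `nan` entries for the unlabelled class arise because that class never appears in the evaluation masks: under the standard mean-IoU convention, a class absent from both prediction and ground truth is excluded as NaN rather than counted as zero. A minimal sketch of that per-class computation (my own illustration, not the exact evaluation code used for this card):

```python
import math
import numpy as np

def per_class_metrics(pred, target, num_classes):
    """Per-class IoU and accuracy; classes absent from both the prediction
    and the ground truth yield NaN, which is why the unlabelled class
    reports `nan` in the metrics above."""
    ious, accs = [], []
    for c in range(num_classes):
        pred_c = pred == c
        target_c = target == c
        inter = np.logical_and(pred_c, target_c).sum()
        union = np.logical_or(pred_c, target_c).sum()
        ious.append(inter / union if union else math.nan)
        accs.append(inter / target_c.sum() if target_c.sum() else math.nan)
    return ious, accs

# Toy 2x2 masks with labels 0=unlabelled, 1=healthy, 2=disease, 3=background
pred = np.array([[1, 1], [2, 3]])
target = np.array([[1, 1], [2, 2]])
ious, accs = per_class_metrics(pred, target, num_classes=4)
# Class 0 never occurs -> NaN; class 3 is predicted but absent from the
# ground truth -> IoU 0.0 but accuracy NaN.
```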
## Model description
More information needed
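While the description is sparse, this is a SegFormer semantic-segmentation checkpoint and can presumably be used through `transformers` in the usual way. A hedged sketch (the repo id is assumed from the card title and may need a namespace prefix; the image path is a placeholder):

```python
# Inference sketch; requires `transformers`, `torch`, and `Pillow`.
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

processor = AutoImageProcessor.from_pretrained("nvidia/mit-b0")  # base model's processor
model = SegformerForSemanticSegmentation.from_pretrained(
    "segformer-finetuned-segments-riceleafdisease-dec-18"  # hypothetical repo id
)

image = Image.open("rice_leaf.jpg")  # placeholder path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the argmax per pixel
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
seg_map = upsampled.argmax(dim=1)[0]  # ids: unlabelled/healthy/disease/background
```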
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
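The hyperparameters above correspond roughly to the following `TrainingArguments` configuration (a sketch only; the actual training script is not part of this card, and the output directory name is an assumption):

```python
from transformers import TrainingArguments

# Configuration implied by the listed hyperparameters. The optimizer line
# matches Adam's defaults (betas=(0.9, 0.999), eps=1e-8), so no explicit
# optimizer arguments are needed here.
training_args = TrainingArguments(
    output_dir="segformer-finetuned-segments-riceleafdisease-dec-18",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
)
```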
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabelled | Accuracy Healthy | Accuracy Disease | Accuracy Background | Iou Unlabelled | Iou Healthy | Iou Disease | Iou Background |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.0053 | 0.33 | 20 | 1.2067 | 0.3883 | 0.6563 | 0.8803 | nan | 0.9886 | 0.0986 | 0.8816 | 0.0 | 0.6303 | 0.0418 | 0.8810 |
0.8256 | 0.67 | 40 | 0.8465 | 0.4445 | 0.7343 | 0.9192 | nan | 0.9352 | 0.3360 | 0.9317 | 0.0 | 0.7090 | 0.1390 | 0.9300 |
0.4785 | 1.0 | 60 | 0.5391 | 0.4488 | 0.6814 | 0.9466 | nan | 0.9824 | 0.0991 | 0.9626 | 0.0 | 0.7520 | 0.0852 | 0.9579 |
0.5875 | 1.33 | 80 | 0.6045 | 0.6251 | 0.7087 | 0.9510 | nan | 0.9790 | 0.1807 | 0.9664 | nan | 0.7718 | 0.1407 | 0.9627 |
0.5354 | 1.67 | 100 | 0.3814 | 0.6633 | 0.7242 | 0.9596 | nan | 0.9313 | 0.2584 | 0.9831 | nan | 0.7937 | 0.2267 | 0.9694 |
0.5146 | 2.0 | 120 | 0.3455 | 0.6755 | 0.7307 | 0.9650 | nan | 0.9507 | 0.2552 | 0.9862 | nan | 0.8187 | 0.2321 | 0.9756 |
0.5212 | 2.33 | 140 | 0.4321 | 0.6471 | 0.7169 | 0.9598 | nan | 0.9798 | 0.1946 | 0.9764 | nan | 0.7934 | 0.1752 | 0.9727 |
0.4044 | 2.67 | 160 | 0.1609 | 0.6607 | 0.7065 | 0.9607 | nan | 0.9067 | 0.2230 | 0.9898 | nan | 0.7994 | 0.2145 | 0.9681 |
0.4572 | 3.0 | 180 | 0.2172 | 0.6882 | 0.7268 | 0.9657 | nan | 0.9050 | 0.2808 | 0.9946 | nan | 0.8254 | 0.2667 | 0.9724 |
0.2676 | 3.33 | 200 | 0.2950 | 0.6926 | 0.7451 | 0.9690 | nan | 0.9674 | 0.2806 | 0.9874 | nan | 0.8332 | 0.2641 | 0.9804 |
0.2991 | 3.67 | 220 | 0.2523 | 0.6976 | 0.7411 | 0.9695 | nan | 0.9337 | 0.2959 | 0.9937 | nan | 0.8401 | 0.2734 | 0.9792 |
0.5168 | 4.0 | 240 | 0.1013 | 0.6599 | 0.6969 | 0.9651 | nan | 0.9065 | 0.1880 | 0.9961 | nan | 0.8226 | 0.1835 | 0.9735 |
0.3117 | 4.33 | 260 | 0.2323 | 0.6951 | 0.7424 | 0.9696 | nan | 0.9496 | 0.2863 | 0.9912 | nan | 0.8409 | 0.2644 | 0.9800 |
0.2888 | 4.67 | 280 | 0.1264 | 0.7466 | 0.7901 | 0.9736 | nan | 0.9277 | 0.4470 | 0.9957 | nan | 0.8582 | 0.3998 | 0.9817 |
0.1684 | 5.0 | 300 | 0.1291 | 0.7646 | 0.8119 | 0.9758 | nan | 0.9373 | 0.5033 | 0.9951 | nan | 0.8693 | 0.4406 | 0.9837 |
0.2041 | 5.33 | 320 | 0.1804 | 0.7720 | 0.8213 | 0.9773 | nan | 0.9462 | 0.5229 | 0.9948 | nan | 0.8732 | 0.4564 | 0.9863 |
0.179 | 5.67 | 340 | 0.1381 | 0.7937 | 0.8560 | 0.9773 | nan | 0.9280 | 0.6454 | 0.9948 | nan | 0.8727 | 0.5235 | 0.9849 |
0.1444 | 6.0 | 360 | 0.1671 | 0.7393 | 0.7972 | 0.9727 | nan | 0.9750 | 0.4301 | 0.9866 | nan | 0.8485 | 0.3862 | 0.9832 |
0.2365 | 6.33 | 380 | 0.1272 | 0.7813 | 0.8275 | 0.9771 | nan | 0.9354 | 0.5511 | 0.9958 | nan | 0.8722 | 0.4870 | 0.9848 |
0.2216 | 6.67 | 400 | 0.0907 | 0.7923 | 0.8358 | 0.9775 | nan | 0.9224 | 0.5875 | 0.9976 | nan | 0.8761 | 0.5171 | 0.9837 |
0.1437 | 7.0 | 420 | 0.0782 | 0.7715 | 0.8148 | 0.9732 | nan | 0.9167 | 0.5329 | 0.9948 | nan | 0.8591 | 0.4777 | 0.9778 |
0.1065 | 7.33 | 440 | 0.0877 | 0.7537 | 0.7917 | 0.9725 | nan | 0.9187 | 0.4610 | 0.9955 | nan | 0.8549 | 0.4281 | 0.9781 |
0.2535 | 7.67 | 460 | 0.0784 | 0.7457 | 0.7810 | 0.9723 | nan | 0.9078 | 0.4374 | 0.9979 | nan | 0.8593 | 0.4003 | 0.9776 |
0.108 | 8.0 | 480 | 0.1003 | 0.7544 | 0.7984 | 0.9759 | nan | 0.9563 | 0.4455 | 0.9934 | nan | 0.8703 | 0.4084 | 0.9844 |
0.0884 | 8.33 | 500 | 0.0744 | 0.7689 | 0.8108 | 0.9753 | nan | 0.9335 | 0.5039 | 0.9952 | nan | 0.8694 | 0.4557 | 0.9816 |
0.1935 | 8.67 | 520 | 0.1047 | 0.7954 | 0.8373 | 0.9800 | nan | 0.9482 | 0.5672 | 0.9965 | nan | 0.8878 | 0.5111 | 0.9874 |
0.268 | 9.0 | 540 | 0.1281 | 0.8058 | 0.8559 | 0.9816 | nan | 0.9563 | 0.6158 | 0.9957 | nan | 0.8945 | 0.5328 | 0.9900 |
0.0779 | 9.33 | 560 | 0.1393 | 0.7680 | 0.8126 | 0.9774 | nan | 0.9645 | 0.4806 | 0.9928 | nan | 0.8763 | 0.4418 | 0.9859 |
0.1295 | 9.67 | 580 | 0.0749 | 0.7433 | 0.7861 | 0.9741 | nan | 0.9572 | 0.4092 | 0.9920 | nan | 0.8595 | 0.3877 | 0.9826 |
0.6322 | 10.0 | 600 | 0.0792 | 0.7825 | 0.8233 | 0.9776 | nan | 0.9419 | 0.5322 | 0.9957 | nan | 0.8757 | 0.4869 | 0.9849 |
0.1491 | 10.33 | 620 | 0.0685 | 0.7805 | 0.8265 | 0.9760 | nan | 0.9320 | 0.5525 | 0.9950 | nan | 0.8753 | 0.4845 | 0.9816 |
0.0876 | 10.67 | 640 | 0.1347 | 0.7672 | 0.8134 | 0.9783 | nan | 0.9713 | 0.4762 | 0.9928 | nan | 0.8825 | 0.4317 | 0.9873 |
0.3076 | 11.0 | 660 | 0.0989 | 0.7737 | 0.8120 | 0.9745 | nan | 0.9152 | 0.5239 | 0.9970 | nan | 0.8700 | 0.4723 | 0.9788 |
0.08 | 11.33 | 680 | 0.0923 | 0.7914 | 0.8350 | 0.9766 | nan | 0.9212 | 0.5869 | 0.9968 | nan | 0.8780 | 0.5150 | 0.9813 |
0.0816 | 11.67 | 700 | 0.0878 | 0.7948 | 0.8392 | 0.9790 | nan | 0.9446 | 0.5773 | 0.9957 | nan | 0.8891 | 0.5101 | 0.9851 |
0.057 | 12.0 | 720 | 0.0824 | 0.8032 | 0.8537 | 0.9792 | nan | 0.9496 | 0.6175 | 0.9940 | nan | 0.8874 | 0.5368 | 0.9855 |
0.0881 | 12.33 | 740 | 0.0802 | 0.7694 | 0.8051 | 0.9766 | nan | 0.9349 | 0.4833 | 0.9970 | nan | 0.8792 | 0.4466 | 0.9825 |
0.1084 | 12.67 | 760 | 0.1033 | 0.8113 | 0.8662 | 0.9793 | nan | 0.9392 | 0.6646 | 0.9947 | nan | 0.8858 | 0.5626 | 0.9854 |
0.0703 | 13.0 | 780 | 0.0889 | 0.7795 | 0.8234 | 0.9765 | nan | 0.9385 | 0.5367 | 0.9949 | nan | 0.8775 | 0.4785 | 0.9824 |
0.1332 | 13.33 | 800 | 0.0803 | 0.7859 | 0.8332 | 0.9779 | nan | 0.9494 | 0.5562 | 0.9940 | nan | 0.8857 | 0.4881 | 0.9840 |
0.0872 | 13.67 | 820 | 0.1034 | 0.7741 | 0.8173 | 0.9788 | nan | 0.9666 | 0.4913 | 0.9939 | nan | 0.8860 | 0.4492 | 0.9871 |
0.0475 | 14.0 | 840 | 0.0728 | 0.7826 | 0.8295 | 0.9771 | nan | 0.9400 | 0.5536 | 0.9948 | nan | 0.8820 | 0.4829 | 0.9829 |
0.0569 | 14.33 | 860 | 0.0940 | 0.7794 | 0.8236 | 0.9786 | nan | 0.9671 | 0.5108 | 0.9930 | nan | 0.8824 | 0.4690 | 0.9867 |
0.7 | 14.67 | 880 | 0.0753 | 0.8024 | 0.8459 | 0.9797 | nan | 0.9443 | 0.5974 | 0.9961 | nan | 0.8905 | 0.5309 | 0.9858 |
0.0805 | 15.0 | 900 | 0.0738 | 0.8145 | 0.8720 | 0.9810 | nan | 0.9542 | 0.6679 | 0.9940 | nan | 0.8929 | 0.5622 | 0.9884 |
0.601 | 15.33 | 920 | 0.0970 | 0.8117 | 0.8614 | 0.9822 | nan | 0.9685 | 0.6216 | 0.9941 | nan | 0.8990 | 0.5461 | 0.9900 |
0.0844 | 15.67 | 940 | 0.0732 | 0.7691 | 0.8128 | 0.9755 | nan | 0.9479 | 0.4975 | 0.9930 | nan | 0.8731 | 0.4529 | 0.9814 |
0.1097 | 16.0 | 960 | 0.0622 | 0.8170 | 0.8721 | 0.9798 | nan | 0.9321 | 0.6881 | 0.9960 | nan | 0.8893 | 0.5761 | 0.9855 |
0.1446 | 16.33 | 980 | 0.0675 | 0.7983 | 0.8491 | 0.9797 | nan | 0.9621 | 0.5921 | 0.9930 | nan | 0.8900 | 0.5180 | 0.9868 |
0.3657 | 16.67 | 1000 | 0.0696 | 0.7900 | 0.8335 | 0.9777 | nan | 0.9371 | 0.5676 | 0.9957 | nan | 0.8836 | 0.5032 | 0.9831 |
0.2767 | 17.0 | 1020 | 0.0883 | 0.7599 | 0.8009 | 0.9715 | nan | 0.9063 | 0.5008 | 0.9955 | nan | 0.8580 | 0.4468 | 0.9750 |
0.1155 | 17.33 | 1040 | 0.0720 | 0.7929 | 0.8337 | 0.9797 | nan | 0.9527 | 0.5526 | 0.9958 | nan | 0.8882 | 0.5034 | 0.9870 |
0.0765 | 17.67 | 1060 | 0.0733 | 0.7843 | 0.8297 | 0.9767 | nan | 0.9395 | 0.5553 | 0.9944 | nan | 0.8798 | 0.4911 | 0.9821 |
0.1484 | 18.0 | 1080 | 0.0833 | 0.7904 | 0.8384 | 0.9788 | nan | 0.9595 | 0.5626 | 0.9932 | nan | 0.8870 | 0.4985 | 0.9856 |
0.1017 | 18.33 | 1100 | 0.0928 | 0.7866 | 0.8304 | 0.9799 | nan | 0.9707 | 0.5269 | 0.9935 | nan | 0.8888 | 0.4829 | 0.9881 |
0.0745 | 18.67 | 1120 | 0.0662 | 0.8084 | 0.8545 | 0.9806 | nan | 0.9472 | 0.6203 | 0.9961 | nan | 0.8947 | 0.5434 | 0.9870 |
0.093 | 19.0 | 1140 | 0.0673 | 0.8026 | 0.8510 | 0.9799 | nan | 0.9517 | 0.6065 | 0.9947 | nan | 0.8926 | 0.5290 | 0.9863 |
0.0475 | 19.33 | 1160 | 0.0818 | 0.7982 | 0.8447 | 0.9805 | nan | 0.9632 | 0.5768 | 0.9942 | nan | 0.8939 | 0.5130 | 0.9878 |
0.051 | 19.67 | 1180 | 0.0690 | 0.7931 | 0.8368 | 0.9800 | nan | 0.9578 | 0.5573 | 0.9951 | nan | 0.8924 | 0.4996 | 0.9873 |
0.0432 | 20.0 | 1200 | 0.0654 | 0.8022 | 0.8495 | 0.9793 | nan | 0.9379 | 0.6144 | 0.9962 | nan | 0.8897 | 0.5318 | 0.9851 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0