mit-b0_necrosis

This model is a fine-tuned version of nvidia/mit-b0 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0528
  • Mean IoU: 0.8813
  • Mean Accuracy: 0.9318
  • Overall Accuracy: 0.9815
  • Accuracy Background: 0.9940
  • Accuracy Necrosis: 0.8416
  • Accuracy Root: 0.9598
  • IoU Background: 0.9877
  • IoU Necrosis: 0.7339
  • IoU Root: 0.9224
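
The per-class IoU and accuracy figures above are standard confusion-matrix statistics over pixels. A minimal sketch of how such metrics are computed (toy labels for illustration, not this model's outputs; class ids 0/1/2 are assumed to map to background/necrosis/root):

```python
import numpy as np

def confusion_matrix(y_true, y_pred, num_classes):
    # rows = true class, cols = predicted class
    cm = np.zeros((num_classes, num_classes), dtype=np.int64)
    for t, p in zip(y_true.ravel(), y_pred.ravel()):
        cm[t, p] += 1
    return cm

def per_class_metrics(cm):
    tp = np.diag(cm).astype(float)
    fn = cm.sum(axis=1) - tp      # pixels of each true class that were missed
    fp = cm.sum(axis=0) - tp      # pixels wrongly assigned to each class
    iou = tp / (tp + fp + fn)     # per-class intersection over union
    acc = tp / (tp + fn)          # per-class (recall-style) accuracy
    overall = tp.sum() / cm.sum() # overall pixel accuracy
    return iou, acc, overall

# toy ground-truth and predicted masks with classes 0/1/2
y_true = np.array([[0, 0, 1], [2, 2, 1]])
y_pred = np.array([[0, 1, 1], [2, 2, 2]])
cm = confusion_matrix(y_true, y_pred, num_classes=3)
iou, acc, overall = per_class_metrics(cm)
mean_iou = iou.mean()
```

"Mean IoU" and "Mean Accuracy" in the table are simply the unweighted means of the per-class columns.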

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 48
  • eval_batch_size: 48
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.05
  • num_epochs: 120
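
With lr_scheduler_warmup_ratio 0.05 and the 2640 total optimizer steps visible in the training log, the linear schedule warms up for the first 132 steps and then decays to zero. A plain-Python sketch of that schedule (step counts taken from the log; Transformers' actual `get_linear_schedule_with_warmup` differs slightly in edge-step handling):

```python
def linear_schedule_lr(step, base_lr=6e-5, total_steps=2640, warmup_ratio=0.05):
    """Learning rate at a given optimizer step: linear warmup, then linear decay."""
    warmup_steps = int(total_steps * warmup_ratio)  # 132 steps here
    if step < warmup_steps:
        return base_lr * step / warmup_steps        # ramp 0 -> base_lr
    # linear decay from base_lr down to 0 at total_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

peak = linear_schedule_lr(132)    # end of warmup: full 6e-5
final = linear_schedule_lr(2640)  # last step: 0.0
```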

Training results

Training Loss | Epoch | Step | Validation Loss | Mean IoU | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Necrosis | Accuracy Root | IoU Background | IoU Necrosis | IoU Root
1.0774 0.9091 20 1.0862 0.2257 0.4936 0.3667 0.2174 0.3590 0.9045 0.2172 0.0290 0.4308
0.79 1.8182 40 0.8729 0.5114 0.6752 0.8224 0.8068 0.2428 0.9761 0.8053 0.0818 0.6471
0.5765 2.7273 60 0.5984 0.6150 0.7260 0.9117 0.9226 0.2759 0.9794 0.9197 0.1919 0.7335
0.4356 3.6364 80 0.4563 0.7147 0.8218 0.9384 0.9483 0.5489 0.9683 0.9448 0.4062 0.7931
0.4001 4.5455 100 0.3937 0.7578 0.8837 0.9442 0.9511 0.7475 0.9525 0.9478 0.5178 0.8078
0.372 5.4545 120 0.3180 0.7836 0.8850 0.9531 0.9617 0.7345 0.9587 0.9578 0.5651 0.8279
0.2767 6.3636 140 0.2898 0.7942 0.9005 0.9541 0.9612 0.7829 0.9576 0.9574 0.5929 0.8322
0.26 7.2727 160 0.2609 0.7970 0.8839 0.9542 0.9600 0.7185 0.9731 0.9565 0.6054 0.8290
0.2399 8.1818 180 0.2031 0.8365 0.9099 0.9683 0.9782 0.7887 0.9628 0.9730 0.6605 0.8761
0.1764 9.0909 200 0.1796 0.8441 0.9299 0.9724 0.9886 0.8701 0.9311 0.9801 0.6641 0.8882
0.1436 10.0 220 0.1594 0.8520 0.9046 0.9752 0.9886 0.7623 0.9630 0.9813 0.6742 0.9004
0.1332 10.9091 240 0.1391 0.8572 0.9300 0.9755 0.9920 0.8626 0.9354 0.9825 0.6892 0.8998
0.1218 11.8182 260 0.1351 0.8519 0.9349 0.9744 0.9903 0.8812 0.9333 0.9822 0.6776 0.8959
0.1128 12.7273 280 0.1128 0.8644 0.9170 0.9774 0.9930 0.8082 0.9499 0.9833 0.7022 0.9077
0.0997 13.6364 300 0.1102 0.8603 0.9145 0.9763 0.9881 0.7900 0.9654 0.9822 0.6948 0.9038
0.0914 14.5455 320 0.1025 0.8661 0.9256 0.9776 0.9929 0.8378 0.9462 0.9838 0.7067 0.9078
0.0958 15.4545 340 0.0968 0.8653 0.9096 0.9779 0.9931 0.7792 0.9565 0.9835 0.7030 0.9095
0.0864 16.3636 360 0.0903 0.8672 0.9139 0.9784 0.9909 0.7847 0.9660 0.9844 0.7064 0.9109
0.0817 17.2727 380 0.0900 0.8543 0.8939 0.9774 0.9924 0.7229 0.9664 0.9843 0.6714 0.9071
0.0737 18.1818 400 0.0841 0.8661 0.9204 0.9775 0.9908 0.8130 0.9574 0.9834 0.7071 0.9078
0.0828 19.0909 420 0.0858 0.8697 0.9343 0.9786 0.9916 0.8593 0.9521 0.9853 0.7117 0.9120
0.0907 20.0 440 0.0799 0.8706 0.9186 0.9784 0.9945 0.8131 0.9482 0.9840 0.7180 0.9096
0.0776 20.9091 460 0.0769 0.8733 0.9256 0.9796 0.9935 0.8283 0.9551 0.9859 0.7192 0.9149
0.0662 21.8182 480 0.0734 0.8698 0.9184 0.9793 0.9921 0.7996 0.9634 0.9856 0.7094 0.9143
0.071 22.7273 500 0.0727 0.8671 0.9088 0.9791 0.9932 0.7694 0.9637 0.9854 0.7023 0.9135
0.0563 23.6364 520 0.0701 0.8723 0.9325 0.9794 0.9922 0.8505 0.9548 0.9861 0.7168 0.9138
0.0666 24.5455 540 0.0730 0.8678 0.9177 0.9786 0.9889 0.7913 0.9731 0.9849 0.7067 0.9118
0.0662 25.4545 560 0.0677 0.8743 0.9269 0.9795 0.9945 0.8365 0.9498 0.9856 0.7232 0.9142
0.0566 26.3636 580 0.0678 0.8737 0.9329 0.9794 0.9936 0.8561 0.9492 0.9859 0.7214 0.9138
0.0536 27.2727 600 0.0646 0.8745 0.9278 0.9800 0.9930 0.8320 0.9584 0.9866 0.7206 0.9164
0.0736 28.1818 620 0.0671 0.8657 0.9091 0.9790 0.9914 0.7651 0.9707 0.9856 0.6980 0.9134
0.0651 29.0909 640 0.0631 0.8717 0.9166 0.9800 0.9920 0.7889 0.9688 0.9864 0.7123 0.9166
0.0484 30.0 660 0.0627 0.8754 0.9213 0.9803 0.9926 0.8058 0.9655 0.9865 0.7217 0.9180
0.0527 30.9091 680 0.0620 0.8762 0.9316 0.9801 0.9912 0.8395 0.9639 0.9865 0.7251 0.9171
0.0592 31.8182 700 0.0646 0.8708 0.9337 0.9789 0.9947 0.8652 0.9411 0.9859 0.7160 0.9106
0.0591 32.7273 720 0.0608 0.8749 0.9204 0.9805 0.9929 0.8026 0.9656 0.9869 0.7196 0.9181
0.052 33.6364 740 0.0603 0.8773 0.9283 0.9803 0.9942 0.8360 0.9546 0.9865 0.7282 0.9174
0.0506 34.5455 760 0.0611 0.8758 0.9367 0.9799 0.9932 0.8654 0.9514 0.9867 0.7252 0.9155
0.0467 35.4545 780 0.0601 0.8771 0.9336 0.9803 0.9944 0.8560 0.9503 0.9868 0.7274 0.9170
0.0466 36.3636 800 0.0578 0.8779 0.9249 0.9810 0.9937 0.8186 0.9623 0.9873 0.7263 0.9202
0.0456 37.2727 820 0.0583 0.8757 0.9191 0.9806 0.9944 0.8020 0.9607 0.9868 0.7217 0.9185
0.0481 38.1818 840 0.0606 0.8642 0.9006 0.9798 0.9940 0.7383 0.9696 0.9869 0.6897 0.9159
0.0601 39.0909 860 0.0566 0.8743 0.9159 0.9808 0.9936 0.7863 0.9677 0.9874 0.7155 0.9198
0.0441 40.0 880 0.0568 0.8799 0.9306 0.9810 0.9934 0.8377 0.9608 0.9873 0.7319 0.9204
0.0416 40.9091 900 0.0569 0.8779 0.9313 0.9806 0.9948 0.8469 0.9520 0.9871 0.7286 0.9180
0.0489 41.8182 920 0.0557 0.8797 0.9288 0.9813 0.9940 0.8318 0.9605 0.9876 0.7300 0.9215
0.0477 42.7273 940 0.0564 0.8759 0.9191 0.9809 0.9935 0.7973 0.9666 0.9873 0.7204 0.9200
0.045 43.6364 960 0.0583 0.8774 0.9408 0.9801 0.9910 0.8721 0.9592 0.9868 0.7284 0.9171
0.0402 44.5455 980 0.0561 0.8795 0.9368 0.9808 0.9951 0.8664 0.9488 0.9874 0.7321 0.9190
0.0463 45.4545 1000 0.0557 0.8761 0.9182 0.9811 0.9928 0.7907 0.9711 0.9876 0.7198 0.9209
0.0356 46.3636 1020 0.0559 0.8793 0.9308 0.9810 0.9945 0.8421 0.9558 0.9874 0.7306 0.9199
0.0446 47.2727 1040 0.0569 0.8749 0.9249 0.9799 0.9963 0.8323 0.9461 0.9863 0.7233 0.9151
0.041 48.1818 1060 0.0561 0.8772 0.9299 0.9807 0.9931 0.8360 0.9606 0.9872 0.7246 0.9198
0.042 49.0909 1080 0.0543 0.8815 0.9339 0.9815 0.9931 0.8457 0.9630 0.9880 0.7339 0.9226
0.0502 50.0 1100 0.0557 0.8742 0.9168 0.9809 0.9937 0.7896 0.9671 0.9876 0.7148 0.9203
0.0403 50.9091 1120 0.0552 0.8771 0.9231 0.9810 0.9917 0.8061 0.9716 0.9873 0.7235 0.9205
0.0412 51.8182 1140 0.0544 0.8779 0.9203 0.9811 0.9951 0.8056 0.9603 0.9873 0.7257 0.9205
0.0426 52.7273 1160 0.0539 0.8797 0.9310 0.9811 0.9943 0.8420 0.9568 0.9874 0.7311 0.9204
0.0423 53.6364 1180 0.0538 0.8795 0.9291 0.9813 0.9940 0.8325 0.9610 0.9878 0.7289 0.9217
0.0302 54.5455 1200 0.0545 0.8778 0.9207 0.9812 0.9940 0.8030 0.9652 0.9875 0.7246 0.9213
0.0563 55.4545 1220 0.0546 0.8786 0.9303 0.9812 0.9935 0.8360 0.9613 0.9878 0.7267 0.9213
0.0462 56.3636 1240 0.0544 0.8758 0.9166 0.9811 0.9939 0.7887 0.9673 0.9874 0.7191 0.9210
0.0323 57.2727 1260 0.0549 0.8784 0.9242 0.9812 0.9924 0.8105 0.9697 0.9875 0.7261 0.9215
0.036 58.1818 1280 0.0528 0.8809 0.9357 0.9814 0.9939 0.8554 0.9577 0.9880 0.7326 0.9220
0.0368 59.0909 1300 0.0553 0.8770 0.9240 0.9809 0.9929 0.8131 0.9662 0.9872 0.7229 0.9208
0.0406 60.0 1320 0.0549 0.8726 0.9119 0.9809 0.9947 0.7753 0.9658 0.9875 0.7101 0.9203
0.0316 60.9091 1340 0.0545 0.8788 0.9244 0.9811 0.9942 0.8177 0.9612 0.9870 0.7280 0.9213
0.0343 61.8182 1360 0.0530 0.8808 0.9294 0.9815 0.9937 0.8317 0.9628 0.9877 0.7321 0.9225
0.0342 62.7273 1380 0.0535 0.8801 0.9315 0.9813 0.9934 0.8396 0.9614 0.9875 0.7306 0.9221
0.0515 63.6364 1400 0.0543 0.8794 0.9299 0.9812 0.9933 0.8340 0.9623 0.9873 0.7290 0.9219
0.0312 64.5455 1420 0.0533 0.8798 0.9312 0.9813 0.9944 0.8419 0.9573 0.9876 0.7302 0.9216
0.0359 65.4545 1440 0.0548 0.8785 0.9351 0.9809 0.9928 0.8527 0.9597 0.9874 0.7273 0.9207
0.0333 66.3636 1460 0.0521 0.8813 0.9276 0.9817 0.9948 0.8276 0.9604 0.9878 0.7327 0.9233
0.0358 67.2727 1480 0.0531 0.8797 0.9284 0.9814 0.9939 0.8292 0.9622 0.9878 0.7290 0.9221
0.0341 68.1818 1500 0.0528 0.8813 0.9319 0.9815 0.9937 0.8410 0.9611 0.9877 0.7336 0.9226
0.0316 69.0909 1520 0.0551 0.8763 0.9212 0.9810 0.9918 0.7992 0.9727 0.9875 0.7209 0.9206
0.0261 70.0 1540 0.0534 0.8803 0.9286 0.9815 0.9930 0.8270 0.9659 0.9877 0.7308 0.9225
0.0331 70.9091 1560 0.0551 0.8764 0.9190 0.9810 0.9927 0.7937 0.9705 0.9874 0.7210 0.9209
0.0398 71.8182 1580 0.0547 0.8758 0.9183 0.9811 0.9930 0.7917 0.9701 0.9876 0.7186 0.9212
0.0368 72.7273 1600 0.0524 0.8804 0.9280 0.9814 0.9943 0.8288 0.9609 0.9877 0.7312 0.9222
0.0368 73.6364 1620 0.0526 0.8811 0.9268 0.9817 0.9939 0.8217 0.9648 0.9879 0.7320 0.9234
0.0319 74.5455 1640 0.0528 0.8812 0.9337 0.9814 0.9934 0.8464 0.9612 0.9878 0.7335 0.9223
0.0321 75.4545 1660 0.0534 0.8802 0.9299 0.9813 0.9938 0.8349 0.9611 0.9876 0.7310 0.9221
0.0339 76.3636 1680 0.0539 0.8794 0.9283 0.9813 0.9927 0.8256 0.9667 0.9877 0.7285 0.9221
0.0289 77.2727 1700 0.0530 0.8795 0.9317 0.9813 0.9940 0.8421 0.9589 0.9878 0.7289 0.9218
0.0355 78.1818 1720 0.0540 0.8797 0.9348 0.9811 0.9940 0.8546 0.9559 0.9875 0.7307 0.9209
0.037 79.0909 1740 0.0531 0.8805 0.9360 0.9812 0.9932 0.8554 0.9594 0.9877 0.7320 0.9217
0.0354 80.0 1760 0.0530 0.8808 0.9319 0.9813 0.9946 0.8446 0.9566 0.9876 0.7334 0.9215
0.0335 80.9091 1780 0.0528 0.8805 0.9359 0.9813 0.9932 0.8549 0.9597 0.9878 0.7319 0.9217
0.0311 81.8182 1800 0.0529 0.8809 0.9361 0.9813 0.9933 0.8555 0.9597 0.9878 0.7328 0.9221
0.0395 82.7273 1820 0.0543 0.8772 0.9413 0.9805 0.9947 0.8838 0.9454 0.9875 0.7264 0.9176
0.0424 83.6364 1840 0.0540 0.8791 0.9403 0.9809 0.9934 0.8733 0.9541 0.9877 0.7294 0.9203
0.035 84.5455 1860 0.0522 0.8814 0.9275 0.9817 0.9942 0.8255 0.9629 0.9879 0.7331 0.9232
0.0418 85.4545 1880 0.0522 0.8803 0.9285 0.9815 0.9941 0.8297 0.9617 0.9879 0.7305 0.9225
0.0365 86.3636 1900 0.0536 0.8802 0.9378 0.9812 0.9935 0.8631 0.9569 0.9878 0.7315 0.9213
0.0344 87.2727 1920 0.0532 0.8809 0.9364 0.9812 0.9934 0.8580 0.9579 0.9875 0.7341 0.9211
0.0257 88.1818 1940 0.0540 0.8775 0.9189 0.9814 0.9932 0.7925 0.9710 0.9879 0.7223 0.9223
0.0333 89.0909 1960 0.0534 0.8797 0.9253 0.9814 0.9926 0.8140 0.9693 0.9876 0.7289 0.9226
0.0407 90.0 1980 0.0531 0.8818 0.9335 0.9814 0.9935 0.8463 0.9608 0.9876 0.7354 0.9225
0.031 90.9091 2000 0.0529 0.8822 0.9345 0.9815 0.9935 0.8496 0.9604 0.9876 0.7366 0.9224
0.0371 91.8182 2020 0.0525 0.8810 0.9316 0.9815 0.9934 0.8388 0.9628 0.9878 0.7325 0.9226
0.0418 92.7273 2040 0.0526 0.8809 0.9302 0.9815 0.9932 0.8329 0.9644 0.9878 0.7321 0.9229
0.0337 93.6364 2060 0.0530 0.8808 0.9279 0.9815 0.9936 0.8259 0.9641 0.9877 0.7318 0.9228
0.0336 94.5455 2080 0.0525 0.8810 0.9289 0.9815 0.9944 0.8320 0.9602 0.9877 0.7331 0.9222
0.0343 95.4545 2100 0.0527 0.8811 0.9285 0.9816 0.9937 0.8279 0.9639 0.9878 0.7323 0.9233
0.0337 96.3636 2120 0.0535 0.8809 0.9311 0.9815 0.9933 0.8367 0.9633 0.9877 0.7322 0.9228
0.0297 97.2727 2140 0.0527 0.8804 0.9292 0.9815 0.9937 0.8312 0.9627 0.9877 0.7308 0.9228
0.0302 98.1818 2160 0.0529 0.8790 0.9227 0.9815 0.9941 0.8084 0.9655 0.9878 0.7265 0.9228
0.037 99.0909 2180 0.0534 0.8814 0.9331 0.9815 0.9936 0.8448 0.9610 0.9877 0.7338 0.9226
0.0591 100.0 2200 0.0520 0.8812 0.9305 0.9816 0.9940 0.8364 0.9611 0.9878 0.7330 0.9229
0.0294 100.9091 2220 0.0538 0.8792 0.9235 0.9815 0.9932 0.8091 0.9682 0.9877 0.7270 0.9228
0.0301 101.8182 2240 0.0530 0.8817 0.9333 0.9815 0.9930 0.8434 0.9634 0.9877 0.7347 0.9227
0.0286 102.7273 2260 0.0533 0.8814 0.9297 0.9816 0.9933 0.8310 0.9647 0.9877 0.7334 0.9230
0.0332 103.6364 2280 0.0523 0.8823 0.9335 0.9816 0.9942 0.8474 0.9589 0.9878 0.7360 0.9229
0.0329 104.5455 2300 0.0526 0.8820 0.9324 0.9816 0.9942 0.8436 0.9593 0.9878 0.7354 0.9228
0.0295 105.4545 2320 0.0528 0.8816 0.9326 0.9814 0.9945 0.8461 0.9571 0.9877 0.7353 0.9220
0.0288 106.3636 2340 0.0526 0.8813 0.9288 0.9816 0.9939 0.8295 0.9629 0.9878 0.7330 0.9230
0.0352 107.2727 2360 0.0527 0.8814 0.9309 0.9816 0.9937 0.8365 0.9625 0.9878 0.7334 0.9230
0.0334 108.1818 2380 0.0526 0.8816 0.9315 0.9816 0.9944 0.8407 0.9592 0.9879 0.7342 0.9229
0.033 109.0909 2400 0.0520 0.8817 0.9300 0.9817 0.9937 0.8328 0.9635 0.9880 0.7337 0.9235
0.0267 110.0 2420 0.0523 0.8814 0.9296 0.9816 0.9941 0.8328 0.9619 0.9879 0.7332 0.9232
0.0281 110.9091 2440 0.0527 0.8817 0.9318 0.9816 0.9939 0.8406 0.9609 0.9878 0.7346 0.9228
0.0329 111.8182 2460 0.0530 0.8816 0.9302 0.9816 0.9941 0.8353 0.9613 0.9877 0.7342 0.9229
0.0374 112.7273 2480 0.0530 0.8812 0.9300 0.9815 0.9936 0.8333 0.9631 0.9877 0.7330 0.9228
0.0298 113.6364 2500 0.0530 0.8811 0.9290 0.9815 0.9941 0.8314 0.9617 0.9877 0.7330 0.9227
0.028 114.5455 2520 0.0529 0.8812 0.9286 0.9816 0.9936 0.8280 0.9642 0.9878 0.7327 0.9231
0.0292 115.4545 2540 0.0535 0.8811 0.9293 0.9815 0.9932 0.8295 0.9651 0.9877 0.7327 0.9230
0.0326 116.3636 2560 0.0530 0.8814 0.9305 0.9815 0.9939 0.8361 0.9616 0.9877 0.7339 0.9228
0.0312 117.2727 2580 0.0532 0.8814 0.9306 0.9815 0.9937 0.8359 0.9623 0.9877 0.7338 0.9227
0.0292 118.1818 2600 0.0533 0.8811 0.9291 0.9815 0.9937 0.8303 0.9633 0.9877 0.7328 0.9228
0.0287 119.0909 2620 0.0533 0.8812 0.9306 0.9815 0.9937 0.8362 0.9619 0.9877 0.7333 0.9226
0.0352 120.0 2640 0.0528 0.8813 0.9318 0.9815 0.9940 0.8416 0.9598 0.9877 0.7339 0.9224
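
At inference time (e.g. loading the checkpoint with `SegformerForSemanticSegmentation.from_pretrained("mujerry/mit-b0_necrosis")`), SegFormer returns logits at 1/4 of the input resolution, which must be upsampled and argmaxed to obtain a class mask. A minimal numpy sketch of that post-processing on dummy logits (nearest-neighbour upsampling stands in for the bilinear `torch.nn.functional.interpolate` normally used; the 0/1/2 class ids are assumed to be background/necrosis/root):

```python
import numpy as np

def logits_to_mask(logits, out_h, out_w):
    """(num_classes, h, w) logits -> (out_h, out_w) class-id mask."""
    num_classes, h, w = logits.shape
    # nearest-neighbour index maps from output pixels to feature-map cells
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    upsampled = logits[:, rows[:, None], cols[None, :]]
    return upsampled.argmax(axis=0)

# dummy logits: 2x2 feature map, 3 classes, upsampled to a 4x4 mask
logits = np.zeros((3, 2, 2))
logits[1, 0, 0] = 5.0  # top-left quadrant -> class 1 ("necrosis")
logits[2, 1, 1] = 5.0  # bottom-right quadrant -> class 2 ("root")
mask = logits_to_mask(logits, 4, 4)
```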

Framework versions

  • Transformers 4.49.0
  • Pytorch 2.6.0+cu124
  • Datasets 3.3.2
  • Tokenizers 0.21.0
Model tree for mujerry/mit-b0_necrosis

  • Base model: nvidia/mit-b0