dungeon-maps-seg-v0.0.1

This model is a fine-tuned version of nvidia/mit-b0 on the cephelos/dungeon-maps-seg dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0361
  • Mean Iou: 0.9518
  • Mean Accuracy: 0.9783
  • Overall Accuracy: 0.9893
  • Accuracy Unlabeled: nan
  • Accuracy Room: 0.9923
  • Accuracy Wall: 0.9490
  • Accuracy Outside: 0.9935
  • Iou Unlabeled: nan
  • Iou Room: 0.9857
  • Iou Wall: 0.8788
  • Iou Outside: 0.9911

Model description

More information needed

Intended uses & limitations

More information needed
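
The class names in the metrics above (room, wall, outside, plus an unlabeled index) indicate the model segments dungeon map images into those regions. Below is a minimal inference sketch, not an official usage recipe: it assumes the checkpoint loads with the standard transformers SegFormer classes, and `dungeon_map.png` is a hypothetical input file.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Hypothetical usage sketch; the repo id matches this card, everything else
# (input file, post-processing choices) is an assumption.
repo = "cephelos/dungeon-maps-seg-v0.0.1"
processor = AutoImageProcessor.from_pretrained(repo)
model = SegformerForSemanticSegmentation.from_pretrained(repo)

image = Image.open("dungeon_map.png").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) map of class indices
print(model.config.id2label)  # mapping from indices to class names
```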

Training and evaluation data

More information needed
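
The dataset id cephelos/dungeon-maps-seg appears at the top of this card; a sketch for inspecting it (split names and feature columns are not documented here, so treat them as assumptions until inspected):

```python
from datasets import load_dataset

# Load the fine-tuning dataset named at the top of the card.
ds = load_dataset("cephelos/dungeon-maps-seg")
print(ds)  # shows splits, feature columns, and row counts
```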

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
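
Expressed as transformers TrainingArguments, these settings would look roughly as follows. This is a reconstruction from the list above; output_dir and any settings not listed (such as evaluation cadence) are assumptions, not values from the original run:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dungeon-maps-seg-v0.0.1",  # assumption
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,      # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```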

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Room | Accuracy Wall | Accuracy Outside | Iou Unlabeled | Iou Room | Iou Wall | Iou Outside |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.2922 | 0.7692 | 20 | 0.2745 | 0.8581 | 0.9561 | 0.9598 | nan | 0.9526 | 0.9466 | 0.9690 | nan | 0.9466 | 0.6646 | 0.9632 |
| 0.2099 | 1.5385 | 40 | 0.2072 | 0.8639 | 0.9584 | 0.9625 | nan | 0.9680 | 0.9472 | 0.9599 | nan | 0.9600 | 0.6732 | 0.9584 |
| 0.2009 | 2.3077 | 60 | 0.1688 | 0.8968 | 0.9623 | 0.9741 | nan | 0.9718 | 0.9316 | 0.9835 | nan | 0.9649 | 0.7477 | 0.9778 |
| 0.1258 | 3.0769 | 80 | 0.1482 | 0.8991 | 0.9676 | 0.9745 | nan | 0.9773 | 0.9492 | 0.9762 | nan | 0.9708 | 0.7529 | 0.9736 |
| 0.1624 | 3.8462 | 100 | 0.1333 | 0.9115 | 0.9682 | 0.9785 | nan | 0.9807 | 0.9410 | 0.9829 | nan | 0.9734 | 0.7817 | 0.9795 |
| 0.1098 | 4.6154 | 120 | 0.1079 | 0.9173 | 0.9624 | 0.9805 | nan | 0.9859 | 0.9145 | 0.9868 | nan | 0.9753 | 0.7950 | 0.9817 |
| 0.1629 | 5.3846 | 140 | 0.1041 | 0.9195 | 0.9711 | 0.9806 | nan | 0.9790 | 0.9462 | 0.9881 | nan | 0.9738 | 0.8013 | 0.9833 |
| 0.1243 | 6.1538 | 160 | 0.0872 | 0.9243 | 0.9675 | 0.9821 | nan | 0.9852 | 0.9288 | 0.9884 | nan | 0.9766 | 0.8125 | 0.9836 |
| 0.0974 | 6.9231 | 180 | 0.0996 | 0.9217 | 0.9731 | 0.9811 | nan | 0.9754 | 0.9525 | 0.9915 | nan | 0.9717 | 0.8073 | 0.9861 |
| 0.0861 | 7.6923 | 200 | 0.0798 | 0.9248 | 0.9706 | 0.9821 | nan | 0.9829 | 0.9403 | 0.9886 | nan | 0.9764 | 0.8142 | 0.9836 |
| 0.0928 | 8.4615 | 220 | 0.0718 | 0.9276 | 0.9740 | 0.9828 | nan | 0.9830 | 0.9507 | 0.9882 | nan | 0.9773 | 0.8209 | 0.9847 |
| 0.0583 | 9.2308 | 240 | 0.0726 | 0.9240 | 0.9686 | 0.9822 | nan | 0.9870 | 0.9326 | 0.9862 | nan | 0.9789 | 0.8111 | 0.9821 |
| 0.0886 | 10.0 | 260 | 0.0700 | 0.9296 | 0.9740 | 0.9835 | nan | 0.9845 | 0.9491 | 0.9885 | nan | 0.9786 | 0.8250 | 0.9852 |
| 0.1133 | 10.7692 | 280 | 0.0651 | 0.9322 | 0.9633 | 0.9848 | nan | 0.9912 | 0.9064 | 0.9922 | nan | 0.9794 | 0.8301 | 0.9872 |
| 0.0821 | 11.5385 | 300 | 0.0616 | 0.9302 | 0.9721 | 0.9836 | nan | 0.9833 | 0.9417 | 0.9912 | nan | 0.9779 | 0.8270 | 0.9857 |
| 0.07 | 12.3077 | 320 | 0.0586 | 0.9394 | 0.9690 | 0.9864 | nan | 0.9896 | 0.9232 | 0.9942 | nan | 0.9810 | 0.8485 | 0.9887 |
| 0.076 | 13.0769 | 340 | 0.0566 | 0.9349 | 0.9651 | 0.9854 | nan | 0.9919 | 0.9113 | 0.9920 | nan | 0.9803 | 0.8365 | 0.9878 |
| 0.0577 | 13.8462 | 360 | 0.0570 | 0.9378 | 0.9755 | 0.9857 | nan | 0.9850 | 0.9488 | 0.9926 | nan | 0.9797 | 0.8452 | 0.9886 |
| 0.1261 | 14.6154 | 380 | 0.0548 | 0.9403 | 0.9739 | 0.9864 | nan | 0.9867 | 0.9410 | 0.9939 | nan | 0.9808 | 0.8511 | 0.9891 |
| 0.0583 | 15.3846 | 400 | 0.0523 | 0.9428 | 0.9736 | 0.9871 | nan | 0.9895 | 0.9379 | 0.9934 | nan | 0.9820 | 0.8566 | 0.9896 |
| 0.0602 | 16.1538 | 420 | 0.0488 | 0.9409 | 0.9737 | 0.9866 | nan | 0.9899 | 0.9394 | 0.9917 | nan | 0.9820 | 0.8519 | 0.9887 |
| 0.0728 | 16.9231 | 440 | 0.0504 | 0.9380 | 0.9716 | 0.9860 | nan | 0.9907 | 0.9335 | 0.9905 | nan | 0.9819 | 0.8448 | 0.9873 |
| 0.0507 | 17.6923 | 460 | 0.0503 | 0.9378 | 0.9739 | 0.9858 | nan | 0.9892 | 0.9424 | 0.9901 | nan | 0.9820 | 0.8445 | 0.9869 |
| 0.077 | 18.4615 | 480 | 0.0474 | 0.9429 | 0.9740 | 0.9871 | nan | 0.9876 | 0.9396 | 0.9949 | nan | 0.9819 | 0.8570 | 0.9897 |
| 0.2137 | 19.2308 | 500 | 0.0500 | 0.9413 | 0.9763 | 0.9866 | nan | 0.9892 | 0.9489 | 0.9907 | nan | 0.9823 | 0.8532 | 0.9882 |
| 0.0991 | 20.0 | 520 | 0.0459 | 0.9440 | 0.9719 | 0.9875 | nan | 0.9899 | 0.9309 | 0.9950 | nan | 0.9827 | 0.8595 | 0.9898 |
| 0.0691 | 20.7692 | 540 | 0.0447 | 0.9451 | 0.9743 | 0.9877 | nan | 0.9906 | 0.9390 | 0.9933 | nan | 0.9831 | 0.8623 | 0.9897 |
| 0.0602 | 21.5385 | 560 | 0.0447 | 0.9462 | 0.9754 | 0.9879 | nan | 0.9885 | 0.9424 | 0.9952 | nan | 0.9828 | 0.8654 | 0.9904 |
| 0.0469 | 22.3077 | 580 | 0.0429 | 0.9466 | 0.9767 | 0.9879 | nan | 0.9889 | 0.9471 | 0.9940 | nan | 0.9830 | 0.8664 | 0.9903 |
| 0.0553 | 23.0769 | 600 | 0.0445 | 0.9468 | 0.9722 | 0.9882 | nan | 0.9913 | 0.9301 | 0.9952 | nan | 0.9832 | 0.8666 | 0.9906 |
| 0.0671 | 23.8462 | 620 | 0.0424 | 0.9455 | 0.9748 | 0.9878 | nan | 0.9900 | 0.9407 | 0.9938 | nan | 0.9833 | 0.8635 | 0.9898 |
| 0.0431 | 24.6154 | 640 | 0.0417 | 0.9475 | 0.9732 | 0.9883 | nan | 0.9921 | 0.9331 | 0.9943 | nan | 0.9836 | 0.8681 | 0.9907 |
| 0.0381 | 25.3846 | 660 | 0.0429 | 0.9449 | 0.9763 | 0.9876 | nan | 0.9881 | 0.9467 | 0.9942 | nan | 0.9827 | 0.8620 | 0.9901 |
| 0.0503 | 26.1538 | 680 | 0.0403 | 0.9471 | 0.9746 | 0.9882 | nan | 0.9924 | 0.9384 | 0.9929 | nan | 0.9841 | 0.8669 | 0.9902 |
| 0.0685 | 26.9231 | 700 | 0.0410 | 0.9496 | 0.9743 | 0.9888 | nan | 0.9913 | 0.9361 | 0.9957 | nan | 0.9842 | 0.8732 | 0.9912 |
| 0.0381 | 27.6923 | 720 | 0.0398 | 0.9494 | 0.9771 | 0.9887 | nan | 0.9906 | 0.9466 | 0.9942 | nan | 0.9843 | 0.8729 | 0.9909 |
| 0.0587 | 28.4615 | 740 | 0.0397 | 0.9500 | 0.9760 | 0.9889 | nan | 0.9913 | 0.9421 | 0.9947 | nan | 0.9843 | 0.8743 | 0.9913 |
| 0.0573 | 29.2308 | 760 | 0.0402 | 0.9489 | 0.9756 | 0.9887 | nan | 0.9913 | 0.9411 | 0.9945 | nan | 0.9845 | 0.8715 | 0.9908 |
| 0.0686 | 30.0 | 780 | 0.0386 | 0.9499 | 0.9763 | 0.9889 | nan | 0.9914 | 0.9433 | 0.9944 | nan | 0.9844 | 0.8740 | 0.9912 |
| 0.037 | 30.7692 | 800 | 0.0386 | 0.9503 | 0.9752 | 0.9890 | nan | 0.9925 | 0.9387 | 0.9944 | nan | 0.9849 | 0.8748 | 0.9911 |
| 0.0565 | 31.5385 | 820 | 0.0389 | 0.9497 | 0.9773 | 0.9888 | nan | 0.9898 | 0.9471 | 0.9950 | nan | 0.9840 | 0.8738 | 0.9913 |
| 0.0405 | 32.3077 | 840 | 0.0383 | 0.9483 | 0.9743 | 0.9886 | nan | 0.9933 | 0.9366 | 0.9930 | nan | 0.9848 | 0.8698 | 0.9903 |
| 0.0618 | 33.0769 | 860 | 0.0383 | 0.9497 | 0.9757 | 0.9889 | nan | 0.9920 | 0.9408 | 0.9942 | nan | 0.9847 | 0.8734 | 0.9910 |
| 0.0398 | 33.8462 | 880 | 0.0379 | 0.9494 | 0.9766 | 0.9888 | nan | 0.9917 | 0.9446 | 0.9936 | nan | 0.9846 | 0.8729 | 0.9908 |
| 0.0488 | 34.6154 | 900 | 0.0376 | 0.9501 | 0.9769 | 0.9889 | nan | 0.9915 | 0.9450 | 0.9941 | nan | 0.9851 | 0.8745 | 0.9907 |
| 0.0574 | 35.3846 | 920 | 0.0379 | 0.9512 | 0.9762 | 0.9892 | nan | 0.9914 | 0.9419 | 0.9953 | nan | 0.9849 | 0.8773 | 0.9914 |
| 0.0331 | 36.1538 | 940 | 0.0368 | 0.9514 | 0.9764 | 0.9893 | nan | 0.9921 | 0.9424 | 0.9947 | nan | 0.9852 | 0.8777 | 0.9913 |
| 0.0578 | 36.9231 | 960 | 0.0368 | 0.9520 | 0.9770 | 0.9894 | nan | 0.9916 | 0.9443 | 0.9951 | nan | 0.9852 | 0.8790 | 0.9917 |
| 0.0471 | 37.6923 | 980 | 0.0369 | 0.9517 | 0.9779 | 0.9893 | nan | 0.9912 | 0.9480 | 0.9947 | nan | 0.9852 | 0.8786 | 0.9915 |
| 0.0388 | 38.4615 | 1000 | 0.0369 | 0.9511 | 0.9776 | 0.9892 | nan | 0.9904 | 0.9473 | 0.9952 | nan | 0.9846 | 0.8770 | 0.9916 |
| 0.0455 | 39.2308 | 1020 | 0.0367 | 0.9517 | 0.9753 | 0.9894 | nan | 0.9928 | 0.9379 | 0.9950 | nan | 0.9853 | 0.8784 | 0.9915 |
| 0.0359 | 40.0 | 1040 | 0.0360 | 0.9516 | 0.9773 | 0.9893 | nan | 0.9917 | 0.9457 | 0.9945 | nan | 0.9853 | 0.8783 | 0.9913 |
| 0.0281 | 40.7692 | 1060 | 0.0363 | 0.9519 | 0.9775 | 0.9894 | nan | 0.9917 | 0.9462 | 0.9946 | nan | 0.9854 | 0.8790 | 0.9913 |
| 0.0394 | 41.5385 | 1080 | 0.0367 | 0.9508 | 0.9769 | 0.9891 | nan | 0.9922 | 0.9446 | 0.9939 | nan | 0.9854 | 0.8761 | 0.9909 |
| 0.0286 | 42.3077 | 1100 | 0.0360 | 0.9525 | 0.9761 | 0.9896 | nan | 0.9924 | 0.9405 | 0.9953 | nan | 0.9855 | 0.8804 | 0.9917 |
| 0.028 | 43.0769 | 1120 | 0.0363 | 0.9509 | 0.9791 | 0.9891 | nan | 0.9909 | 0.9530 | 0.9936 | nan | 0.9850 | 0.8767 | 0.9911 |
| 0.0523 | 43.8462 | 1140 | 0.0366 | 0.9526 | 0.9777 | 0.9895 | nan | 0.9919 | 0.9466 | 0.9947 | nan | 0.9856 | 0.8806 | 0.9915 |
| 0.0492 | 44.6154 | 1160 | 0.0364 | 0.9523 | 0.9764 | 0.9895 | nan | 0.9926 | 0.9419 | 0.9948 | nan | 0.9856 | 0.8799 | 0.9915 |
| 0.0331 | 45.3846 | 1180 | 0.0356 | 0.9523 | 0.9781 | 0.9894 | nan | 0.9906 | 0.9484 | 0.9954 | nan | 0.9852 | 0.8799 | 0.9917 |
| 0.0443 | 46.1538 | 1200 | 0.0358 | 0.9533 | 0.9772 | 0.9897 | nan | 0.9921 | 0.9443 | 0.9953 | nan | 0.9857 | 0.8824 | 0.9918 |
| 0.0331 | 46.9231 | 1220 | 0.0356 | 0.9527 | 0.9771 | 0.9896 | nan | 0.9929 | 0.9441 | 0.9943 | nan | 0.9858 | 0.8808 | 0.9915 |
| 0.0546 | 47.6923 | 1240 | 0.0357 | 0.9532 | 0.9774 | 0.9897 | nan | 0.9916 | 0.9450 | 0.9956 | nan | 0.9856 | 0.8821 | 0.9919 |
| 0.0297 | 48.4615 | 1260 | 0.0351 | 0.9526 | 0.9776 | 0.9896 | nan | 0.9925 | 0.9461 | 0.9942 | nan | 0.9857 | 0.8807 | 0.9915 |
| 0.053 | 49.2308 | 1280 | 0.0349 | 0.9527 | 0.9779 | 0.9896 | nan | 0.9921 | 0.9471 | 0.9945 | nan | 0.9856 | 0.8809 | 0.9916 |
| 0.0474 | 50.0 | 1300 | 0.0361 | 0.9518 | 0.9783 | 0.9893 | nan | 0.9923 | 0.9490 | 0.9935 | nan | 0.9857 | 0.8788 | 0.9911 |
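
The reported Mean Iou is consistent with the nan-ignoring mean of the per-class IoUs (the unlabeled class is excluded as nan), which can be checked directly for the final evaluation row:

```python
import numpy as np

# Final row: IoU for unlabeled, room, wall, outside.
per_class_iou = [float("nan"), 0.9857, 0.8788, 0.9911]
print(round(float(np.nanmean(per_class_iou)), 4))
# 0.9519, matching the reported 0.9518 up to rounding of the per-class values
```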

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.2.0+cpu
  • Datasets 2.19.1
  • Tokenizers 0.19.1