segformer-b0-winter

This model is a fine-tuned version of nvidia/mit-b0 on the johanhag/winter-test dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1441
  • Mean Iou: 0.8861
  • Mean Accuracy: 0.9456
  • Overall Accuracy: 0.9660
  • Accuracy Unlabeled: nan
  • Accuracy Object: nan
  • Accuracy Road: 0.9769
  • Accuracy Side walk: 0.8930
  • Accuracy Car: 0.9347
  • Accuracy Pedestrian: nan
  • Accuracy Other: 0.9779
  • Iou Unlabeled: nan
  • Iou Object: nan
  • Iou Road: 0.9197
  • Iou Side walk: 0.8250
  • Iou Car: 0.8319
  • Iou Pedestrian: nan
  • Iou Other: 0.9678
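
For quick inference, a minimal sketch using the Transformers semantic-segmentation API is shown below. It assumes the checkpoint is loaded by this repo's id, johanhag/segformer-b0-winter; the image path is a placeholder.

```python
# Hedged inference sketch; not part of the original card.
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "johanhag/segformer-b0-winter"  # assumed repo id
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("winter_scene.png")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# Upsample logits to the original resolution and take the per-pixel argmax
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation_map = upsampled.argmax(dim=1)[0]  # (height, width) class ids
```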

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
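
As a rough reconstruction, these values map onto Transformers TrainingArguments as sketched below; only the hyperparameters listed above come from the card, and the output directory is a placeholder.

```python
from transformers import TrainingArguments

# Hedged sketch: values above are from the card; output_dir is assumed.
training_args = TrainingArguments(
    output_dir="segformer-b0-winter",   # placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```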

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Object | Accuracy Road | Accuracy Side walk | Accuracy Car | Accuracy Pedestrian | Accuracy Other | Iou Unlabeled | Iou Object | Iou Road | Iou Side walk | Iou Car | Iou Pedestrian | Iou Other |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.2207 | 4.0 | 20 | 0.2300 | 0.8576 | 0.9439 | 0.9583 | nan | nan | 0.9738 | 0.8767 | 0.9564 | nan | 0.9686 | nan | nan | 0.9018 | 0.8000 | 0.7670 | nan | 0.9617 |
| 0.1792 | 8.0 | 40 | 0.2126 | 0.8696 | 0.9457 | 0.9614 | nan | nan | 0.9768 | 0.8911 | 0.9444 | nan | 0.9706 | nan | nan | 0.9106 | 0.8122 | 0.7924 | nan | 0.9633 |
| 0.1527 | 12.0 | 60 | 0.1869 | 0.8769 | 0.9470 | 0.9634 | nan | nan | 0.9776 | 0.9023 | 0.9364 | nan | 0.9718 | nan | nan | 0.9180 | 0.8165 | 0.8085 | nan | 0.9647 |
| 0.1329 | 16.0 | 80 | 0.1787 | 0.8783 | 0.9429 | 0.9634 | nan | nan | 0.9772 | 0.8880 | 0.9314 | nan | 0.9749 | nan | nan | 0.9126 | 0.8117 | 0.8229 | nan | 0.9661 |
| 0.1746 | 20.0 | 100 | 0.1651 | 0.8864 | 0.9511 | 0.9668 | nan | nan | 0.9771 | 0.9126 | 0.9395 | nan | 0.9751 | nan | nan | 0.9258 | 0.8369 | 0.8157 | nan | 0.9671 |
| 0.1218 | 24.0 | 120 | 0.1652 | 0.8798 | 0.9444 | 0.9643 | nan | nan | 0.9791 | 0.8858 | 0.9370 | nan | 0.9757 | nan | nan | 0.9140 | 0.8156 | 0.8224 | nan | 0.9673 |
| 0.0816 | 28.0 | 140 | 0.1473 | 0.8921 | 0.9521 | 0.9684 | nan | nan | 0.9723 | 0.9199 | 0.9383 | nan | 0.9780 | nan | nan | 0.9299 | 0.8478 | 0.8231 | nan | 0.9676 |
| 0.0893 | 32.0 | 160 | 0.1490 | 0.8892 | 0.9502 | 0.9672 | nan | nan | 0.9749 | 0.9140 | 0.9354 | nan | 0.9766 | nan | nan | 0.9260 | 0.8384 | 0.8250 | nan | 0.9673 |
| 0.0849 | 36.0 | 180 | 0.1517 | 0.8861 | 0.9476 | 0.9660 | nan | nan | 0.9791 | 0.8987 | 0.9367 | nan | 0.9760 | nan | nan | 0.9205 | 0.8258 | 0.8308 | nan | 0.9674 |
| 0.1625 | 40.0 | 200 | 0.1519 | 0.8843 | 0.9468 | 0.9654 | nan | nan | 0.9777 | 0.8938 | 0.9394 | nan | 0.9763 | nan | nan | 0.9176 | 0.8235 | 0.8289 | nan | 0.9673 |
| 0.1396 | 44.0 | 220 | 0.1500 | 0.8850 | 0.9476 | 0.9655 | nan | nan | 0.9791 | 0.8949 | 0.9408 | nan | 0.9757 | nan | nan | 0.9187 | 0.8223 | 0.8317 | nan | 0.9674 |
| 0.0931 | 48.0 | 240 | 0.1441 | 0.8861 | 0.9456 | 0.9660 | nan | nan | 0.9769 | 0.8930 | 0.9347 | nan | 0.9779 | nan | nan | 0.9197 | 0.8250 | 0.8319 | nan | 0.9678 |
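
The per-class accuracy and IoU columns above match what the evaluate library's mean_iou metric reports (the nan entries likely come from classes with no ground-truth pixels in the evaluation split). A hedged sketch of the metric call follows; the label count and ignore index are assumptions about the winter-test labels.

```python
import evaluate
import numpy as np

metric = evaluate.load("mean_iou")

# Dummy 2x2 label maps just to show the call shape; in practice these are the
# per-image predicted and reference segmentation maps.
predictions = [np.array([[2, 2], [3, 6]], dtype=np.int64)]
references = [np.array([[2, 2], [3, 6]], dtype=np.int64)]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=7,        # assumed number of classes
    ignore_index=255,    # assumed ignore label
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
```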

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.15.0
  • Tokenizers 0.15.0