
segformer-b0-finetuned-robot-hiking

This model is a fine-tuned version of nvidia/mit-b0 on the twdent/Hiking dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1458
  • Mean Iou: 0.6158
  • Mean Accuracy: 0.9603
  • Overall Accuracy: 0.9619
  • Accuracy Unlabeled: nan
  • Accuracy Traversable: 0.9546
  • Accuracy Non-traversable: 0.9661
  • Iou Unlabeled: 0.0
  • Iou Traversable: 0.9044
  • Iou Non-traversable: 0.9429
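
A minimal inference sketch is shown below. The repository id (`twdent/segformer-b0-finetuned-robot-hiking`) and the label layout (unlabeled / traversable / non-traversable) are assumptions inferred from the metrics above, not confirmed by the card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Repository id is assumed; replace with the actual Hub id of this checkpoint.
checkpoint = "twdent/segformer-b0-finetuned-robot-hiking"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("trail.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
print(model.config.id2label)       # check the actual class-index mapping
```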

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
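
No dataset documentation is provided here; as a starting point, the twdent/Hiking dataset named above can presumably be loaded from the Hub as follows (split names and feature columns are assumptions to verify):

```python
from datasets import load_dataset

# Dataset id taken from this card; inspect the splits and features before use,
# since the image/mask column names are not documented here.
dataset = load_dataset("twdent/Hiking")
print(dataset)
```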

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
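
For reference, these hyperparameters map onto the 🤗 Transformers Trainer roughly as follows. This is a sketch, not the original training script: the label mapping and the omitted data preprocessing are assumptions.

```python
from transformers import SegformerForSemanticSegmentation, TrainingArguments

# Label mapping is an assumption based on the metrics reported in this card.
id2label = {0: "unlabeled", 1: "traversable", 2: "non-traversable"}
label2id = {v: k for k, v in id2label.items()}

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=len(id2label),
    id2label=id2label,
    label2id=label2id,
)

# Values below mirror the hyperparameter list; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-robot-hiking",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)

# A Trainer would then pair these arguments with preprocessed
# train/eval splits of twdent/Hiking (preprocessing not shown).
```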

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Traversable | Accuracy Non-traversable | Iou Unlabeled | Iou Traversable | Iou Non-traversable |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.5363 | 1.33 | 20 | 0.7433 | 0.5581 | 0.9246 | 0.9153 | nan | 0.9573 | 0.8919 | 0.0 | 0.8030 | 0.8712 |
| 0.3689 | 2.67 | 40 | 0.4666 | 0.5567 | 0.9281 | 0.9137 | nan | 0.9791 | 0.8771 | 0.0 | 0.8030 | 0.8670 |
| 0.2706 | 4.0 | 60 | 0.3204 | 0.5880 | 0.9479 | 0.9407 | nan | 0.9732 | 0.9226 | 0.0 | 0.8550 | 0.9089 |
| 0.2338 | 5.33 | 80 | 0.2881 | 0.5985 | 0.9504 | 0.9497 | nan | 0.9527 | 0.9481 | 0.0 | 0.8719 | 0.9236 |
| 0.2068 | 6.67 | 100 | 0.2556 | 0.9022 | 0.9521 | 0.9521 | nan | 0.9522 | 0.9521 | nan | 0.8770 | 0.9273 |
| 0.1764 | 8.0 | 120 | 0.2401 | 0.6024 | 0.9539 | 0.9528 | nan | 0.9577 | 0.9500 | 0.0 | 0.8792 | 0.9280 |
| 0.2639 | 9.33 | 140 | 0.2588 | 0.5937 | 0.9504 | 0.9455 | nan | 0.9680 | 0.9329 | 0.0 | 0.8646 | 0.9166 |
| 0.1813 | 10.67 | 160 | 0.2124 | 0.6030 | 0.9526 | 0.9530 | nan | 0.9513 | 0.9540 | 0.0 | 0.8801 | 0.9287 |
| 0.1407 | 12.0 | 180 | 0.1938 | 0.6055 | 0.9518 | 0.9554 | nan | 0.9388 | 0.9647 | 0.0 | 0.8836 | 0.9328 |
| 0.13 | 13.33 | 200 | 0.1881 | 0.6062 | 0.9524 | 0.9558 | nan | 0.9403 | 0.9644 | 0.0 | 0.8854 | 0.9333 |
| 0.107 | 14.67 | 220 | 0.2092 | 0.5967 | 0.9530 | 0.9474 | nan | 0.9725 | 0.9334 | 0.0 | 0.8708 | 0.9194 |
| 0.1282 | 16.0 | 240 | 0.1803 | 0.6065 | 0.9536 | 0.9555 | nan | 0.9471 | 0.9602 | 0.0 | 0.8869 | 0.9328 |
| 0.146 | 17.33 | 260 | 0.1912 | 0.6028 | 0.9559 | 0.9519 | nan | 0.9700 | 0.9418 | 0.0 | 0.8814 | 0.9269 |
| 0.1011 | 18.67 | 280 | 0.1769 | 0.6079 | 0.9598 | 0.9561 | nan | 0.9727 | 0.9468 | 0.0 | 0.8907 | 0.9330 |
| 0.1124 | 20.0 | 300 | 0.1580 | 0.6135 | 0.9582 | 0.9608 | nan | 0.9491 | 0.9673 | 0.0 | 0.8995 | 0.9411 |
| 0.0801 | 21.33 | 320 | 0.1614 | 0.6113 | 0.9582 | 0.9588 | nan | 0.9563 | 0.9602 | 0.0 | 0.8960 | 0.9380 |
| 0.0831 | 22.67 | 340 | 0.1540 | 0.6130 | 0.9608 | 0.9601 | nan | 0.9633 | 0.9584 | 0.0 | 0.8994 | 0.9396 |
| 0.0599 | 24.0 | 360 | 0.1641 | 0.6098 | 0.9584 | 0.9576 | nan | 0.9614 | 0.9554 | 0.0 | 0.8935 | 0.9358 |
| 0.0955 | 25.33 | 380 | 0.1711 | 0.6084 | 0.9597 | 0.9562 | nan | 0.9720 | 0.9474 | 0.0 | 0.8917 | 0.9334 |
| 0.0667 | 26.67 | 400 | 0.1618 | 0.6109 | 0.9574 | 0.9583 | nan | 0.9543 | 0.9605 | 0.0 | 0.8954 | 0.9373 |
| 0.0783 | 28.0 | 420 | 0.1640 | 0.6089 | 0.9589 | 0.9568 | nan | 0.9665 | 0.9513 | 0.0 | 0.8924 | 0.9343 |
| 0.0743 | 29.33 | 440 | 0.1512 | 0.6145 | 0.9582 | 0.9612 | nan | 0.9478 | 0.9686 | 0.0 | 0.9016 | 0.9419 |
| 0.0775 | 30.67 | 460 | 0.1574 | 0.6131 | 0.9583 | 0.9598 | nan | 0.9528 | 0.9637 | 0.0 | 0.8995 | 0.9398 |
| 0.0773 | 32.0 | 480 | 0.1464 | 0.6157 | 0.9610 | 0.9621 | nan | 0.9573 | 0.9647 | 0.0 | 0.9043 | 0.9428 |
| 0.0575 | 33.33 | 500 | 0.1600 | 0.6085 | 0.9568 | 0.9564 | nan | 0.9583 | 0.9554 | 0.0 | 0.8912 | 0.9343 |
| 0.0729 | 34.67 | 520 | 0.1540 | 0.6105 | 0.9569 | 0.9577 | nan | 0.9541 | 0.9597 | 0.0 | 0.8946 | 0.9369 |
| 0.1409 | 36.0 | 540 | 0.1557 | 0.6112 | 0.9584 | 0.9586 | nan | 0.9575 | 0.9593 | 0.0 | 0.8962 | 0.9376 |
| 0.0543 | 37.33 | 560 | 0.1607 | 0.6103 | 0.9546 | 0.9581 | nan | 0.9422 | 0.9670 | 0.0 | 0.8938 | 0.9373 |
| 0.063 | 38.67 | 580 | 0.1622 | 0.6099 | 0.9558 | 0.9574 | nan | 0.9504 | 0.9613 | 0.0 | 0.8933 | 0.9364 |
| 0.0549 | 40.0 | 600 | 0.1543 | 0.6118 | 0.9571 | 0.9590 | nan | 0.9503 | 0.9639 | 0.0 | 0.8969 | 0.9386 |
| 0.0766 | 41.33 | 620 | 0.1481 | 0.6139 | 0.9575 | 0.9606 | nan | 0.9467 | 0.9684 | 0.0 | 0.9005 | 0.9412 |
| 0.0616 | 42.67 | 640 | 0.1485 | 0.6151 | 0.9600 | 0.9614 | nan | 0.9550 | 0.9649 | 0.0 | 0.9032 | 0.9422 |
| 0.0872 | 44.0 | 660 | 0.1511 | 0.6144 | 0.9607 | 0.9609 | nan | 0.9602 | 0.9612 | 0.0 | 0.9022 | 0.9409 |
| 0.0677 | 45.33 | 680 | 0.1510 | 0.6139 | 0.9599 | 0.9604 | nan | 0.9580 | 0.9618 | 0.0 | 0.9012 | 0.9405 |
| 0.1075 | 46.67 | 700 | 0.1506 | 0.6145 | 0.9606 | 0.9610 | nan | 0.9592 | 0.9619 | 0.0 | 0.9024 | 0.9411 |
| 0.0485 | 48.0 | 720 | 0.1450 | 0.6159 | 0.9595 | 0.9621 | nan | 0.9506 | 0.9685 | 0.0 | 0.9043 | 0.9433 |
| 0.0972 | 49.33 | 740 | 0.1458 | 0.6158 | 0.9603 | 0.9619 | nan | 0.9546 | 0.9661 | 0.0 | 0.9044 | 0.9429 |

Framework versions

  • Transformers 4.35.0.dev0
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.0