
segformer-b0-finetuned-segments-greenhouse-jun-24

This model is a fine-tuned version of nvidia/mit-b0 on an unspecified greenhouse segmentation dataset. It achieves the following results on the evaluation set (see the note after the list on how the nan and 0.0 entries arise):

  • Loss: 0.6502
  • Mean Iou: 0.3640
  • Mean Accuracy: 0.4319
  • Overall Accuracy: 0.8283
  • Accuracy Unlabeled: nan
  • Accuracy Object: 0.0
  • Accuracy Road: 0.9324
  • Accuracy Plant: 0.8871
  • Accuracy Iron: 0.0017
  • Accuracy Wood: nan
  • Accuracy Wall: 0.7226
  • Accuracy Raw Road: 0.9465
  • Accuracy Bottom Wall: 0.0
  • Accuracy Roof: 0.0
  • Accuracy Grass: nan
  • Accuracy Mulch: 0.8289
  • Accuracy Person: nan
  • Accuracy Tomato: 0.0
  • Iou Unlabeled: nan
  • Iou Object: 0.0
  • Iou Road: 0.7525
  • Iou Plant: 0.7027
  • Iou Iron: 0.0017
  • Iou Wood: nan
  • Iou Wall: 0.5584
  • Iou Raw Road: 0.8998
  • Iou Bottom Wall: 0.0
  • Iou Roof: 0.0
  • Iou Grass: nan
  • Iou Mulch: 0.7252
  • Iou Person: nan
  • Iou Tomato: 0.0
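
A nan entry means the metric is undefined for that class on this evaluation set (typically because the class never appears in the labels), while 0.0 means the class appears but is never predicted correctly. The numbers above are consistent with the evaluate library's mean_iou metric; the sketch below shows how such metrics are typically computed. The random arrays, the class count of 14 (matching the per-class list above), and ignore_index=255 are assumptions, not documented settings.

```python
import numpy as np
import evaluate

# Hypothetical label maps: in practice, `references` are the ground-truth
# segmentation masks and `predictions` the argmax of the model's logits,
# upsampled to the mask resolution.
references = [np.random.randint(0, 14, size=(128, 128))]
predictions = [np.random.randint(0, 14, size=(128, 128))]

metric = evaluate.load("mean_iou")
results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=14,     # assumption: 14 classes, matching the per-class list
    ignore_index=255,  # assumption: pixels labeled 255 are ignored
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"])  # one entry per class; nan if undefined
```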

Model description

SegFormer combines a hierarchical Transformer encoder (the Mix Transformer, MiT) with a lightweight all-MLP decode head; mit-b0 is the smallest encoder in that family. This checkpoint fine-tunes nvidia/mit-b0 for semantic segmentation of greenhouse scenes (roads, plants, walls, mulch, and related classes). The author has not provided further details.

Intended uses & limitations

More information needed. One limitation is visible in the evaluation results above: the Object, Bottom Wall, Roof, and Tomato classes score 0.0 accuracy and IoU (and Iron is close to 0.0), i.e. the model effectively never predicts them.
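
In the absence of usage guidance from the author, the following is a minimal inference sketch using transformers; the Hub repo id and the image path are placeholders to substitute:

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Placeholder repo id; substitute the actual Hub path of this checkpoint.
ckpt = "your-username/segformer-b0-finetuned-segments-greenhouse-jun-24"
processor = SegformerImageProcessor.from_pretrained(ckpt)
model = SegformerForSemanticSegmentation.from_pretrained(ckpt)

image = Image.open("greenhouse.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
seg_map = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```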

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30
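
As a rough guide, these settings map onto transformers TrainingArguments as sketched below; output_dir and every argument not in the list above are assumptions:

```python
from transformers import TrainingArguments

# Values mirror the hyperparameter list; output_dir and all omitted
# arguments (logging, saving, evaluation cadence) are assumptions.
training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-greenhouse-jun-24",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
)
```

These arguments would then be passed to a Trainer together with the model, datasets, and a metrics function, none of which this card documents.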

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Object | Accuracy Road | Accuracy Plant | Accuracy Iron | Accuracy Wood | Accuracy Wall | Accuracy Raw Road | Accuracy Bottom Wall | Accuracy Roof | Accuracy Grass | Accuracy Mulch | Accuracy Person | Accuracy Tomato | Iou Unlabeled | Iou Object | Iou Road | Iou Plant | Iou Iron | Iou Wood | Iou Wall | Iou Raw Road | Iou Bottom Wall | Iou Roof | Iou Grass | Iou Mulch | Iou Person | Iou Tomato |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.9416 | 1.05 | 20 | 2.3650 | 0.1880 | 0.3464 | 0.6650 | nan | 0.0 | 0.7192 | 0.7931 | 0.2656 | nan | 0.0681 | 0.8201 | 0.0 | 0.0 | nan | 0.7950 | nan | 0.0029 | nan | 0.0 | 0.4874 | 0.5054 | 0.1242 | 0.0 | 0.0676 | 0.8065 | 0.0 | 0.0 | 0.0 | 0.4498 | 0.0 | 0.0027 |
| 1.4047 | 2.11 | 40 | 1.6208 | 0.2889 | 0.3699 | 0.7203 | nan | 0.0 | 0.7452 | 0.8135 | 0.0384 | nan | 0.4353 | 0.8655 | 0.0 | 0.0 | nan | 0.8014 | nan | 0.0 | nan | 0.0 | 0.4970 | 0.5407 | 0.0371 | nan | 0.4041 | 0.8614 | 0.0 | 0.0 | nan | 0.5489 | nan | 0.0 |
| 1.4998 | 3.16 | 60 | 1.2645 | 0.3150 | 0.3936 | 0.7522 | nan | 0.0 | 0.7532 | 0.8121 | 0.0174 | nan | 0.6304 | 0.9056 | 0.0 | 0.0 | nan | 0.8171 | nan | 0.0 | nan | 0.0 | 0.5316 | 0.5644 | 0.0174 | nan | 0.5346 | 0.8961 | 0.0 | 0.0 | nan | 0.6057 | nan | 0.0 |
| 1.0844 | 4.21 | 80 | 1.1551 | 0.3234 | 0.4083 | 0.7685 | nan | 0.0 | 0.8290 | 0.7952 | 0.0230 | nan | 0.6585 | 0.9033 | 0.0 | 0.0 | nan | 0.8740 | nan | 0.0 | nan | 0.0 | 0.5971 | 0.5910 | 0.0229 | nan | 0.5307 | 0.8905 | 0.0 | 0.0 | nan | 0.6020 | nan | 0.0 |
| 1.2949 | 5.26 | 100 | 1.0333 | 0.3363 | 0.4129 | 0.7841 | nan | 0.0 | 0.8274 | 0.8389 | 0.0140 | nan | 0.7114 | 0.9133 | 0.0 | 0.0 | nan | 0.8243 | nan | 0.0 | nan | 0.0 | 0.6211 | 0.6125 | 0.0140 | nan | 0.5854 | 0.8890 | 0.0 | 0.0 | nan | 0.6410 | nan | 0.0 |
| 1.3389 | 6.32 | 120 | 0.9260 | 0.3417 | 0.4155 | 0.7932 | nan | 0.0 | 0.8668 | 0.8408 | 0.0 | nan | 0.7105 | 0.9202 | 0.0 | 0.0 | nan | 0.8164 | nan | 0.0 | nan | 0.0 | 0.6489 | 0.6214 | 0.0 | nan | 0.6039 | 0.8936 | 0.0 | 0.0 | nan | 0.6495 | nan | 0.0 |
| 0.7833 | 7.37 | 140 | 0.9264 | 0.3357 | 0.4075 | 0.7871 | nan | 0.0 | 0.8811 | 0.8468 | 0.0 | nan | 0.6389 | 0.9125 | 0.0 | 0.0 | nan | 0.7963 | nan | 0.0 | nan | 0.0 | 0.6176 | 0.6285 | 0.0 | nan | 0.5777 | 0.8915 | 0.0 | 0.0 | nan | 0.6419 | nan | 0.0 |
| 1.0194 | 8.42 | 160 | 0.8761 | 0.3499 | 0.4231 | 0.8038 | nan | 0.0 | 0.8549 | 0.8586 | 0.0 | nan | 0.7365 | 0.9299 | 0.0 | 0.0 | nan | 0.8508 | nan | 0.0 | nan | 0.0 | 0.6797 | 0.6342 | 0.0 | nan | 0.6119 | 0.8995 | 0.0 | 0.0 | nan | 0.6738 | nan | 0.0 |
| 0.5558 | 9.47 | 180 | 0.8468 | 0.3458 | 0.4174 | 0.7981 | nan | 0.0 | 0.8533 | 0.8817 | 0.0 | nan | 0.6946 | 0.9063 | 0.0 | 0.0 | nan | 0.8381 | nan | 0.0 | nan | 0.0 | 0.6659 | 0.6338 | 0.0 | nan | 0.6155 | 0.8865 | 0.0 | 0.0 | nan | 0.6564 | nan | 0.0 |
| 1.2579 | 10.53 | 200 | 0.7776 | 0.3502 | 0.4184 | 0.8047 | nan | 0.0 | 0.8678 | 0.8680 | 0.0 | nan | 0.6966 | 0.9388 | 0.0 | 0.0 | nan | 0.8131 | nan | 0.0 | nan | 0.0 | 0.6432 | 0.6556 | 0.0 | nan | 0.6191 | 0.8990 | 0.0 | 0.0 | nan | 0.6852 | nan | 0.0 |
| 0.7671 | 11.58 | 220 | 0.7935 | 0.3579 | 0.4276 | 0.8152 | nan | 0.0 | 0.8816 | 0.8768 | 0.0 | nan | 0.7413 | 0.9356 | 0.0 | 0.0 | nan | 0.8410 | nan | 0.0 | nan | 0.0 | 0.6987 | 0.6610 | 0.0 | nan | 0.6315 | 0.9022 | 0.0 | 0.0 | nan | 0.6857 | nan | 0.0 |
| 0.5097 | 12.63 | 240 | 0.7718 | 0.3549 | 0.4262 | 0.8129 | nan | 0.0 | 0.9047 | 0.8658 | 0.0 | nan | 0.7146 | 0.9298 | 0.0 | 0.0 | nan | 0.8467 | nan | 0.0 | nan | 0.0 | 0.6773 | 0.6707 | 0.0 | nan | 0.6172 | 0.9016 | 0.0 | 0.0 | nan | 0.6818 | nan | 0.0 |
| 0.624 | 13.68 | 260 | 0.7270 | 0.3609 | 0.4282 | 0.8228 | nan | 0.0 | 0.8772 | 0.9219 | 0.0004 | nan | 0.7225 | 0.9308 | 0.0 | 0.0 | nan | 0.8291 | nan | 0.0 | nan | 0.0 | 0.7310 | 0.6897 | 0.0004 | nan | 0.5916 | 0.8975 | 0.0 | 0.0 | nan | 0.6988 | nan | 0.0 |
| 0.535 | 14.74 | 280 | 0.7681 | 0.3526 | 0.4243 | 0.8085 | nan | 0.0 | 0.9574 | 0.8230 | 0.0009 | nan | 0.7059 | 0.9289 | 0.0 | 0.0 | nan | 0.8268 | nan | 0.0 | nan | 0.0 | 0.6786 | 0.6512 | 0.0009 | nan | 0.6011 | 0.9014 | 0.0 | 0.0 | nan | 0.6930 | nan | 0.0 |
| 0.6093 | 15.79 | 300 | 0.6960 | 0.3636 | 0.4349 | 0.8257 | nan | 0.0 | 0.9296 | 0.8704 | 0.0102 | nan | 0.7227 | 0.9435 | 0.0 | 0.0 | nan | 0.8722 | nan | 0.0 | nan | 0.0 | 0.7270 | 0.6943 | 0.0102 | nan | 0.5991 | 0.9034 | 0.0 | 0.0 | nan | 0.7024 | nan | 0.0 |
| 0.5584 | 16.84 | 320 | 0.6886 | 0.3671 | 0.4368 | 0.8281 | nan | 0.0 | 0.9186 | 0.8889 | 0.0157 | nan | 0.7333 | 0.9371 | 0.0 | 0.0 | nan | 0.8739 | nan | 0.0 | nan | 0.0 | 0.7428 | 0.6928 | 0.0157 | nan | 0.6008 | 0.9040 | 0.0 | 0.0 | nan | 0.7148 | nan | 0.0 |
| 0.4421 | 17.89 | 340 | 0.6946 | 0.3644 | 0.4336 | 0.8238 | nan | 0.0 | 0.9061 | 0.8956 | 0.0308 | nan | 0.7280 | 0.9336 | 0.0 | 0.0 | nan | 0.8422 | nan | 0.0 | nan | 0.0 | 0.7217 | 0.6974 | 0.0308 | nan | 0.5717 | 0.9021 | 0.0 | 0.0 | nan | 0.7199 | nan | 0.0 |
| 0.7997 | 18.95 | 360 | 0.7025 | 0.3580 | 0.4266 | 0.8172 | nan | 0.0 | 0.8983 | 0.8901 | 0.0075 | nan | 0.6955 | 0.9330 | 0.0 | 0.0 | nan | 0.8415 | nan | 0.0 | nan | 0.0 | 0.7140 | 0.6754 | 0.0075 | nan | 0.5592 | 0.9020 | 0.0 | 0.0 | nan | 0.7216 | nan | 0.0 |
| 0.8388 | 20.0 | 380 | 0.6959 | 0.3632 | 0.4366 | 0.8242 | nan | 0.0 | 0.9513 | 0.8467 | 0.0120 | nan | 0.7460 | 0.9393 | 0.0 | 0.0 | nan | 0.8710 | nan | 0.0 | nan | 0.0 | 0.7218 | 0.6943 | 0.0120 | nan | 0.5799 | 0.9040 | 0.0 | 0.0 | nan | 0.7199 | nan | 0.0 |
| 0.6424 | 21.05 | 400 | 0.6728 | 0.3651 | 0.4285 | 0.8280 | nan | 0.0 | 0.8680 | 0.9419 | 0.0007 | nan | 0.7148 | 0.9412 | 0.0 | 0.0 | nan | 0.8186 | nan | 0.0 | nan | 0.0 | 0.7527 | 0.6967 | 0.0007 | nan | 0.5737 | 0.9026 | 0.0 | 0.0 | nan | 0.7249 | nan | 0.0 |
| 0.3287 | 22.11 | 420 | 0.6786 | 0.3621 | 0.4314 | 0.8247 | nan | 0.0 | 0.9357 | 0.8771 | 0.0053 | nan | 0.7122 | 0.9410 | 0.0 | 0.0 | nan | 0.8427 | nan | 0.0 | nan | 0.0 | 0.7335 | 0.6949 | 0.0053 | nan | 0.5626 | 0.9025 | 0.0 | 0.0 | nan | 0.7222 | nan | 0.0 |
| 0.386 | 23.16 | 440 | 0.6603 | 0.3667 | 0.4354 | 0.8295 | nan | 0.0 | 0.9165 | 0.9030 | 0.0122 | nan | 0.7266 | 0.9361 | 0.0 | 0.0 | nan | 0.8593 | nan | 0.0 | nan | 0.0 | 0.7526 | 0.7050 | 0.0122 | nan | 0.5635 | 0.9033 | 0.0 | 0.0 | nan | 0.7301 | nan | 0.0 |
| 0.3378 | 24.21 | 460 | 0.6791 | 0.3644 | 0.4331 | 0.8265 | nan | 0.0 | 0.9426 | 0.8772 | 0.0103 | nan | 0.7197 | 0.9405 | 0.0 | 0.0 | nan | 0.8403 | nan | 0.0 | nan | 0.0 | 0.7441 | 0.6939 | 0.0103 | nan | 0.5636 | 0.9039 | 0.0 | 0.0 | nan | 0.7284 | nan | 0.0 |
| 0.3678 | 25.26 | 480 | 0.6915 | 0.3633 | 0.4342 | 0.8227 | nan | 0.0 | 0.9479 | 0.8577 | 0.0234 | nan | 0.7165 | 0.9384 | 0.0 | 0.0 | nan | 0.8579 | nan | 0.0 | nan | 0.0 | 0.7171 | 0.6910 | 0.0234 | nan | 0.5647 | 0.9051 | 0.0 | 0.0 | nan | 0.7320 | nan | 0.0 |
| 0.328 | 26.32 | 500 | 0.6879 | 0.3662 | 0.4360 | 0.8259 | nan | 0.0 | 0.9434 | 0.8741 | 0.0266 | nan | 0.7189 | 0.9346 | 0.0 | 0.0 | nan | 0.8627 | nan | 0.0 | nan | 0.0 | 0.7357 | 0.6927 | 0.0266 | nan | 0.5712 | 0.9042 | 0.0 | 0.0 | nan | 0.7316 | nan | 0.0 |
| 0.8502 | 27.37 | 520 | 0.6593 | 0.3644 | 0.4332 | 0.8270 | nan | 0.0 | 0.9414 | 0.8739 | 0.0066 | nan | 0.7263 | 0.9446 | 0.0 | 0.0 | nan | 0.8390 | nan | 0.0 | nan | 0.0 | 0.7449 | 0.6962 | 0.0066 | nan | 0.5647 | 0.9020 | 0.0 | 0.0 | nan | 0.7294 | nan | 0.0 |
| 0.3528 | 28.42 | 540 | 0.6777 | 0.3626 | 0.4305 | 0.8238 | nan | 0.0 | 0.9439 | 0.8717 | 0.0114 | nan | 0.7046 | 0.9429 | 0.0 | 0.0 | nan | 0.8307 | nan | 0.0 | nan | 0.0 | 0.7364 | 0.6872 | 0.0114 | nan | 0.5563 | 0.9029 | 0.0 | 0.0 | nan | 0.7320 | nan | 0.0 |
| 0.5908 | 29.47 | 560 | 0.6502 | 0.3640 | 0.4319 | 0.8283 | nan | 0.0 | 0.9324 | 0.8871 | 0.0017 | nan | 0.7226 | 0.9465 | 0.0 | 0.0 | nan | 0.8289 | nan | 0.0 | nan | 0.0 | 0.7525 | 0.7027 | 0.0017 | nan | 0.5584 | 0.8998 | 0.0 | 0.0 | nan | 0.7252 | nan | 0.0 |

Framework versions

  • Transformers 4.33.2
  • Pytorch 2.0.1
  • Datasets 2.15.0
  • Tokenizers 0.13.3