# segformer-webots-grasp

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 4.4405
- Mean Iou: 0.0802
- Mean Accuracy: 0.8142
- Overall Accuracy: 0.8142
- Per Category Iou: [0.8018109121927621, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0]
- Per Category Accuracy: [0.8141941463133944, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]
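The sections below are still mostly unfilled, but the checkpoint can be exercised like any SegFormer segmentation model. Here is a minimal inference sketch; the Hub repo id and the image path are placeholders, not values from this card:

```python
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

# Placeholder repo id: substitute the actual Hub path or a local checkpoint dir.
checkpoint = "your-username/segformer-webots-grasp"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("scene.png").convert("RGB")  # placeholder input frame
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```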

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):

- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
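A reconstruction of these settings as 🤗 `TrainingArguments`; this is a sketch inferred from the list above, not the original training script, and `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-webots-grasp",  # placeholder output directory
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    # Trainer's default Adam-style optimizer, configured via the adam_* arguments.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=1,
)
```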

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
| 5.4277 | 0.03 | 1 | 5.4055 | 0.0001 | 0.0011 | 0.0011 | [0.0010665493208110168, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.001069721346006343, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 5.3872 | 0.05 | 2 | 5.3965 | 0.0004 | 0.0053 | 0.0053 | [0.005245532234607124, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.005252500148444879, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 5.3659 | 0.07 | 3 | 5.3862 | 0.0013 | 0.0160 | 0.0160 | [0.01595653583377678, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.015969302211124733, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 5.2685 | 0.1 | 4 | 5.3741 | 0.0037 | 0.0445 | 0.0445 | [0.04445613702342968, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.044479595103583916, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 5.2464 | 0.12 | 5 | 5.3603 | 0.0113 | 0.1359 | 0.1359 | [0.13585817440138379, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.1359222528362154, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 5.1881 | 0.15 | 6 | 5.3461 | 0.0240 | 0.2880 | 0.2880 | [0.287886510883373, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.2879980631769163, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 5.1556 | 0.17 | 7 | 5.3322 | 0.0329 | 0.3949 | 0.3949 | [0.3948099641784779, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.3949236748463366, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 5.0952 | 0.2 | 8 | 5.3175 | 0.0400 | 0.4807 | 0.4807 | [0.4805425893875233, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.4807401186212317, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 5.0421 | 0.23 | 9 | 5.3015 | 0.0606 | 0.5455 | 0.5455 | [0.5450016850554845, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.5454547402275828, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 5.0168 | 0.25 | 10 | 5.2843 | 0.0659 | 0.5936 | 0.5936 | [0.5926648159557198, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.5935808761369806, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.9417 | 0.28 | 11 | 5.2641 | 0.0575 | 0.6341 | 0.6341 | [0.6324597013631272, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.6340619477314868, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.8962 | 0.3 | 12 | 5.2394 | 0.0606 | 0.6696 | 0.6696 | [0.6670764824873869, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.6695910817989438, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.9963 | 0.33 | 13 | 5.2132 | 0.0574 | 0.6918 | 0.6918 | [0.6887950316615776, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.6918253700562499, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.9677 | 0.35 | 14 | 5.1887 | 0.0703 | 0.7061 | 0.7061 | [0.7026749604186757, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7060962792061474, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.921 | 0.38 | 15 | 5.1585 | 0.0799 | 0.7228 | 0.7228 | [0.7188799212945968, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7228307612069762, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.9849 | 0.4 | 16 | 5.1298 | 0.0808 | 0.7311 | 0.7311 | [0.7268635588232967, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7311038850932571, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.8045 | 0.42 | 17 | 5.0921 | 0.0823 | 0.7458 | 0.7458 | [0.7409411768283134, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7458280867505752, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.6795 | 0.45 | 18 | 5.0554 | 0.0835 | 0.7575 | 0.7575 | [0.7518766884727517, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7574526981157601, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.7354 | 0.47 | 19 | 5.0164 | 0.0848 | 0.7698 | 0.7698 | [0.7632263888994257, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7697950481236874, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.64 | 0.5 | 20 | 4.9783 | 0.0771 | 0.7779 | 0.7779 | [0.7705429510590344, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7778955474494109, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.6682 | 0.53 | 21 | 4.9336 | 0.0777 | 0.7858 | 0.7858 | [0.7773627351835143, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7858173007762596, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.6887 | 0.55 | 22 | 4.8904 | 0.0782 | 0.7914 | 0.7914 | [0.7821032704184515, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.791440760086753, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.5486 | 0.57 | 23 | 4.8541 | 0.0785 | 0.7943 | 0.7943 | [0.7845670293951555, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7943371186267411, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.6823 | 0.6 | 24 | 4.8202 | 0.0787 | 0.7972 | 0.7972 | [0.786876415221171, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.797181751012945, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.5797 | 0.62 | 25 | 4.7832 | 0.0789 | 0.8001 | 0.8001 | [0.7893493389876929, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8000900463576524, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.5976 | 0.65 | 26 | 4.7564 | 0.0789 | 0.7999 | 0.7999 | [0.7892736529850544, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7998681442186382, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.5417 | 0.68 | 27 | 4.7152 | 0.0793 | 0.8040 | 0.8040 | [0.7927347074267436, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8040157247507503, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.608 | 0.7 | 28 | 4.6820 | 0.0794 | 0.8053 | 0.8053 | [0.7938510702940416, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8053333643486208, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.4955 | 0.72 | 29 | 4.6511 | 0.0795 | 0.8062 | 0.8062 | [0.7946513690612669, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8061980175109864, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.5359 | 0.75 | 30 | 4.6301 | 0.0794 | 0.8057 | 0.8057 | [0.7942641352298992, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8056544337883806, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.4116 | 0.78 | 31 | 4.5990 | 0.0797 | 0.8094 | 0.8094 | [0.7974774354228353, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8094237094324631, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.4995 | 0.8 | 32 | 4.5779 | 0.0798 | 0.8098 | 0.8098 | [0.7978091719139048, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8097701828412411, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.5087 | 0.82 | 33 | 4.5585 | 0.0799 | 0.8112 | 0.8112 | [0.7989926828268309, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8112010690479878, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.4534 | 0.85 | 34 | 4.5346 | 0.0799 | 0.8116 | 0.8116 | [0.7994616833790221, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8116476279732591, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.4347 | 0.88 | 35 | 4.5080 | 0.0800 | 0.8121 | 0.8121 | [0.7999173944895492, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8121012265525958, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.4389 | 0.9 | 36 | 4.4889 | 0.0800 | 0.8119 | 0.8119 | [0.7998204784246967, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8119105437490014, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.428 | 0.93 | 37 | 4.4843 | 0.0799 | 0.8105 | 0.8105 | [0.7987525641385264, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8105448508603376, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.5333 | 0.95 | 38 | 4.4847 | 0.0798 | 0.8094 | 0.8094 | [0.7978741861189972, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8093536189637262, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.4458 | 0.97 | 39 | 4.4634 | 0.0798 | 0.8098 | 0.8098 | [0.7981748026222505, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8097799771425493, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.4749 | 1.0 | 40 | 4.4405 | 0.0802 | 0.8142 | 0.8142 | [0.8018109121927621, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8141941463133944, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
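In the per-category columns, `nan` marks classes with no ground-truth pixels in the evaluation split (accuracy is undefined for them), while a `0.0` IoU marks classes the model predicted somewhere without ever overlapping ground truth; in effect only the dominant class is learned here. The layout of these metrics matches the 🤗 `evaluate` library's `mean_iou`, which the standard SegFormer fine-tuning examples use. Assuming that is what produced these numbers, a minimal sketch of the computation (toy label maps; `num_labels=16` is inferred from the 16-entry lists):

```python
import evaluate
import numpy as np

# Assumption: metrics were computed with evaluate's "mean_iou";
# ignore_index=255 is the common convention for unlabeled pixels.
metric = evaluate.load("mean_iou")

predictions = [np.zeros((8, 8), dtype=np.int64)]  # toy predicted label map
references = [np.zeros((8, 8), dtype=np.int64)]   # toy ground-truth label map

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=16,
    ignore_index=255,
    reduce_labels=False,
)
# per_category_iou / per_category_accuracy are nan for classes absent from
# both predictions and references, mirroring the table above.
print(results["mean_iou"], results["per_category_iou"])
```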

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0