segformer-b0-scene-parse-150

This model is a fine-tuned version of nvidia/mit-b0 on the scene_parse_150 dataset. It achieves the following results on the evaluation set:

  • Loss: 2.7158
  • Mean Iou: 0.0575
  • Mean Accuracy: 0.0995
  • Overall Accuracy: 0.4648
  • Per Category Iou: [0.44672496974409803, 0.5246878610396156, 0.2073942489175086, 0.4461580147251187, 0.6709173669159216, 0.35982779947389176, 0.0005154694530654325, 0.009501153711522114, 0.23323905377607992, 0.0, 0.023848147241266732, 0.0, 0.06428503562945369, 0.0, 0.0, 0.00526018196460086, 0.0, 0.0, 0.0004003660489590483, 0.2826172203237914, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan]
  • Per Category Accuracy: [0.8701105877534303, 0.7649097707689807, 0.20824275665250883, 0.6818336289049002, 0.9654490232009587, 0.49512427161374717, 0.006057546693589096, 0.01288659793814433, 0.4959889393146437, nan, 0.034012615588327307, nan, 0.06484693975349345, 0.0, 0.0, 0.00827783320300914, nan, 0.0, 0.0004003660489590483, 0.4684163288044319, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan]
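
For convenience, here is a minimal inference sketch (not part of the original card) showing how a SegFormer checkpoint like this one is typically loaded with the Transformers API. The repo id matches this model's Hub path; the input image path is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo_id = "alex-levashov/segformer-b0-scene-parse-150"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)

image = Image.open("scene.jpg")  # placeholder: any RGB scene image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# Upsample logits to the original resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
label_map = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```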

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
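
As a rough sketch (not taken from the original training script), these settings map onto the standard `TrainingArguments` as follows; `output_dir` is a placeholder, and the Adam betas and epsilon listed above are the Trainer defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-scene-parse-150",  # placeholder, assumption
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults,
    # matching the optimizer settings listed above.
)
```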

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------|:-----|:----|:---------------|:--------|:-------------|:----------------|:----------------|:---------------------|
| 2.8861 | 10.0 | 200 | 3.4518 | 0.0460 | 0.0871 | 0.4387 | [0.3969301711292726, 0.407009124541566, 0.1858691819464034, 0.3487187527048191, 0.6198477877978043, 0.43618812656641603, 0.0, 0.1088497725164539, 0.05231273336889431, 0.0, 0.0, 0.0, 0.01404489007098984, 0.0, 0.0, 0.0001569283883454517, 0.0, 0.0, 0.0, 0.14669763591205962, 0.0, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan] | [0.9001446344460373, 0.6596260770606406, 0.18804276334124834, 0.609796983742136, 0.9662352814360626, 0.6622963491497206, 0.0, 0.191012324625998, 0.053624014810070224, nan, 0.0, nan, 0.014069658226149629, 0.0, 0.0, 0.0001617817564106021, nan, 0.0, 0.0, 0.19742502553310018, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan] |
| 2.0228 | 20.0 | 400 | 3.0714 | 0.0521 | 0.0902 | 0.4319 | [0.3908819659806409, 0.34176425750121264, 0.27734684694336714, 0.3467711453980972, 0.6652598893529553, 0.3993713022078525, 0.0, 0.11508504324411957, 0.16300110838512025, 0.0, 0.037551428372190325, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.18148929755803436, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan] | [0.7808233497167042, 0.4810925052836937, 0.2885856660312364, 0.6733491542655118, 0.9645296083292647, 0.7610893090736116, 0.0, 0.15819510115494922, 0.2044742659407441, nan, 0.04701380148273178, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.24220853579276408, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan] |
| 2.0541 | 30.0 | 600 | 2.8125 | 0.0606 | 0.1022 | 0.4683 | [0.4354912810082317, 0.5136657316079992, 0.2571735614101172, 0.46600687018210146, 0.6816991679609, 0.46349720485077905, 0.003975688393168351, 0.015114196148678908, 0.14418364714985812, 0.0, 0.021026667032093622, nan, 0.012695499216091163, 0.0, 0.0, 0.0007345439706182412, 0.0, 0.0, 0.0, 0.31855511784736595, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan] | [0.833117874940269, 0.922861323362055, 0.25877618362819527, 0.6713901002087563, 0.9657660628118877, 0.7062076346771317, 0.046062594649167087, 0.019620572048678397, 0.3056529788081643, nan, 0.02790853334691413, nan, 0.012727865207307022, 0.0, 0.0, 0.0009706905384636126, nan, 0.0, 0.0, 0.4429760762588592, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan] |
| 1.9657 | 40.0 | 800 | 2.7501 | 0.0563 | 0.0985 | 0.4660 | [0.4502025953819058, 0.5305299792942421, 0.20067731011127238, 0.47464834479446677, 0.6634585667585132, 0.3259851182020951, 0.0, 0.014531871786918676, 0.2514721268503095, 0.0, 0.03485342019543974, nan, 0.01199095889361376, 0.0, 0.0, 0.009941192943153179, 0.0, 0.0, 0.002573634543894767, 0.23698272648191873, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan] | [0.8888362686872824, 0.7831246951715168, 0.20808668606401123, 0.6802372568673983, 0.9664445275792758, 0.40083541443691284, 0.0, 0.02133555538330362, 0.5200553034267815, nan, 0.054492939199266635, nan, 0.011999463282792463, 0.0, 0.0, 0.01340092215601154, nan, 0.0, 0.0025737817433081674, 0.47216118349788, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan] |
| 1.608 | 50.0 | 1000 | 2.7158 | 0.0575 | 0.0995 | 0.4648 | [0.44672496974409803, 0.5246878610396156, 0.2073942489175086, 0.4461580147251187, 0.6709173669159216, 0.35982779947389176, 0.0005154694530654325, 0.009501153711522114, 0.23323905377607992, 0.0, 0.023848147241266732, 0.0, 0.06428503562945369, 0.0, 0.0, 0.00526018196460086, 0.0, 0.0, 0.0004003660489590483, 0.2826172203237914, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan] | [0.8701105877534303, 0.7649097707689807, 0.20824275665250883, 0.6818336289049002, 0.9654490232009587, 0.49512427161374717, 0.006057546693589096, 0.01288659793814433, 0.4959889393146437, nan, 0.034012615588327307, nan, 0.06484693975349345, 0.0, 0.0, 0.00827783320300914, nan, 0.0, 0.0004003660489590483, 0.4684163288044319, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan] |
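
The IoU and accuracy columns are the standard semantic-segmentation metrics. As a hedged sketch (the card does not show the evaluation code), results of this shape are typically produced with the `evaluate` library's `mean_iou` metric; the dummy `preds` and `labels` arrays below are placeholders for real (H, W) label maps:

```python
import evaluate
import numpy as np

metric = evaluate.load("mean_iou")

# Dummy stand-ins: replace with real predicted and ground-truth label maps.
preds = [np.zeros((64, 64), dtype=np.int64)]
labels = [np.zeros((64, 64), dtype=np.int64)]

results = metric.compute(
    predictions=preds,
    references=labels,
    num_labels=150,      # scene_parse_150 has 150 semantic classes
    ignore_index=255,    # common choice for unlabeled pixels
    reduce_labels=False,
)

print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
# results["per_category_iou"] and results["per_category_accuracy"] have the
# same 150-entry shape as the arrays above; nan marks classes absent from
# the evaluated images.
```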

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1