# segformer-b0-scene-parse-150
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the scene_parse_150 dataset. It achieves the following results on the evaluation set (a usage sketch follows the metric summary below):
- Loss: 3.1233
- Mean Iou: 0.0499
- Mean Accuracy: 0.1002
- Overall Accuracy: 0.4411
- Per Category Iou: [0.48558573747771877, 0.13040346143692588, 0.9710373374855027, 0.35917887889695804, 0.0, 0.45950935394480524, 0.0, 0.03469888477930799, 0.14258438038925844, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.012871409962161157, 0.00034894270360806757, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan]
- Per Category Accuracy: [0.8242440823423323, 0.8852976913730255, 0.9952386581553576, 0.9823774713471163, nan, 0.5796817673875851, nan, 0.03478226494157777, 0.28032290615539857, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.01971402783715738, 0.009380863039399626, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan]
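For reference, a minimal inference sketch is shown below. It assumes the checkpoint is published under the repo id `ichigoDifuku/segformer-b0-scene-parse-150` and that the image processor config was pushed alongside the weights; it is an illustrative sketch, not an official usage example from the author.

```python
import torch
import requests
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Repo id taken from this model card; adjust if the checkpoint lives elsewhere.
# If no image processor config was pushed, load the processor from "nvidia/mit-b0" instead.
checkpoint = "ichigoDifuku/segformer-b0-scene-parse-150"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

# Any RGB image works; this URL is only a placeholder example.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (batch, num_labels, height/4, width/4)

# Upsample to the original image size and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (height, width) map of class ids
```

Which class id corresponds to which scene_parse_150 category depends on the `id2label` mapping saved with the checkpoint.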
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows this list):
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
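As a rough reconstruction (not the author's actual training script), these values map onto a `transformers` `TrainingArguments` object roughly as sketched below; the output directory name is a placeholder.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="segformer-b0-scene-parse-150",  # placeholder
    learning_rate=6e-05,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```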
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
2.7421 | 1.0 | 20 | 3.3833 | 0.0463 | 0.0951 | 0.4139 | [0.5042443160472232, 0.05850488171016955, 0.8878082320584848, 0.3552059557553269, 0.0, 0.4974007656659279, 0.0, 0.0033347828296804143, 0.10203485541816766, nan, 0.0, 0.0, 0.0, nan, 0.00024932681759250023, 0.0, nan, 0.0, 0.0, 0.001161751563896336, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan] | [0.7513188848470885, 0.9163122721749696, 0.9921519951664171, 0.9314398624508411, nan, 0.6103084423611317, nan, 0.0033347828296804143, 0.16920282542885973, nan, 0.0, nan, 0.0, nan, 0.00025600655376777645, 0.0, nan, 0.0, 0.0, 0.0012350957199182937, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan] |
3.338 | 2.0 | 40 | 3.1824 | 0.0462 | 0.0946 | 0.4332 | [0.4686670326604448, 0.13915730032570722, 0.9648828898840046, 0.3274356365713295, 0.0, 0.4190856605515857, 0.0, 0.027812371447731, 0.09790729038808323, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 3.951632024025923e-05, 0.0012555534093104115, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan] | [0.8435733274776226, 0.8578675577156744, 0.9947986444952321, 0.9834918033414642, nan, 0.4892109453031517, nan, 0.027872685276693794, 0.13039354187689203, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 4.750368153531899e-05, 0.024390243902439025, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan] |
2.8348 | 3.0 | 60 | 3.2355 | 0.0479 | 0.0989 | 0.4308 | [0.4977587888816171, 0.09677729869319446, 0.9652496798975673, 0.3337000632188886, 0.0, 0.4557538169132082, 0.0, 0.022702263454603938, 0.16553768864913834, nan, 0.0, 0.0, 0.0, nan, 8.51032789077602e-05, 0.0, nan, 0.0, 0.0, 0.0005362313659600329, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan] | [0.7980985739304478, 0.8856622114216282, 0.9901752173798828, 0.9884163083061771, nan, 0.5406592102530632, nan, 0.022719443393402722, 0.3245206861755802, nan, 0.0, nan, 0.0, nan, 8.960229381872175e-05, 0.0, nan, 0.0, 0.0, 0.0007125552230297848, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan] |
2.9176 | 4.0 | 80 | 3.0589 | 0.0482 | 0.0988 | 0.4382 | [0.4815405728526445, 0.120691916707657, 0.9705470746080861, 0.348264306570234, 0.0, 0.45544011136441587, 0.0, 0.03298657094071039, 0.1403600808116887, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0036486576956550848, 0.0, nan, 0.0, 0.00011733708812015317, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan] | [0.8235601161302631, 0.8944106925880924, 0.9928941077573752, 0.9833692651152817, nan, 0.5581135471365988, nan, 0.033056987795993746, 0.2537840565085772, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.005177901287349769, 0.0, nan, 0.0, 0.00011874561450855509, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan] |
2.1008 | 5.0 | 100 | 3.1233 | 0.0499 | 0.1002 | 0.4411 | [0.48558573747771877, 0.13040346143692588, 0.9710373374855027, 0.35917887889695804, 0.0, 0.45950935394480524, 0.0, 0.03469888477930799, 0.14258438038925844, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.012871409962161157, 0.00034894270360806757, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan] | [0.8242440823423323, 0.8852976913730255, 0.9952386581553576, 0.9823774713471163, nan, 0.5796817673875851, nan, 0.03478226494157777, 0.28032290615539857, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.01971402783715738, 0.009380863039399626, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan] |
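The metric columns above (Mean Iou, Mean Accuracy, Overall Accuracy and the per-category lists, in which `nan` generally marks categories that never occur in the evaluation labels) match the output format of the `mean_iou` metric from the Hugging Face `evaluate` library. A minimal sketch of computing such values with that metric, assuming 150 labels and 255 as the ignore index, is shown below; it is an assumption about how the numbers were produced, not a verbatim excerpt of the training code.

```python
import numpy as np
import evaluate

# Assumed setup: 150 semantic classes (scene_parse_150) with 255 as the ignore index.
metric = evaluate.load("mean_iou")

# Toy 2x2 label maps standing in for real predicted and ground-truth segmentation maps.
predictions = [np.array([[0, 1], [2, 2]], dtype=np.int64)]
references = [np.array([[0, 1], [2, 255]], dtype=np.int64)]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=150,
    ignore_index=255,
    reduce_labels=False,  # whether labels need reducing depends on the preprocessing used
)

print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"])       # one entry per label, NaN for categories never seen
print(results["per_category_accuracy"])
```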
### Framework versions
- Transformers 4.38.1
- Pytorch 2.2.1+cu118
- Datasets 2.17.1
- Tokenizers 0.15.2