
segformer-b0-scene-parse-150-MASKED5

This model is a fine-tuned version of nvidia/mit-b0 on the scene_parse_150 dataset. It achieves the following results on the evaluation set:

  • Loss: 4.4715
  • Mean Iou: 0.0161
  • Mean Accuracy: 0.0620
  • Overall Accuracy: 0.2198
  • Per Category Iou: [0.26811535900844996, 0.13361073012813385, 0.08961412470909454, 0.1128982647416514, 0.0583985070987828, 0.07100614244377887, 0.19794722756159697, 0.012499372941156798, 0.13135492375006866, 0.07853702343048412, 0.0, 0.0, 0.000257651881995438, nan, 0.00015613885949237522, 0.00816753033501802, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0]
  • Per Category Accuracy: [0.5251087767953342, 0.3576830362495917, 0.09867066511502999, 0.24665550110621756, 0.14805877178383775, 0.08033983486897212, 0.8465623033354311, 0.014047451256753583, 0.38394991259676786, 0.20428242788886922, 0.0, 0.0, 0.0002825539553398542, nan, 0.00015841165909810963, 0.0103471565581521, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]
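
The two per-category lists follow the 150-label SceneParse150 ordering; nan entries mark categories with no pixels in the evaluation annotations, and they are skipped when the means are taken. As a minimal sketch, numbers of this shape can be reproduced with the evaluate library's mean_iou metric (the random label maps below are illustrative stand-ins for real predictions and annotations):

```python
import numpy as np
import evaluate

# Semantic-segmentation metric that produces mean/overall accuracy, mean IoU,
# and per-category arrays like those reported above.
metric = evaluate.load("mean_iou")

# Toy inputs: predicted and reference label maps for two 4x4 "images".
# Real predictions come from an argmax over the model's upsampled logits.
predictions = [np.random.randint(0, 150, (4, 4)) for _ in range(2)]
references = [np.random.randint(0, 150, (4, 4)) for _ in range(2)]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=150,       # SceneParse150 has 150 semantic categories
    ignore_index=255,     # pixels labeled 255 are excluded from scoring
    reduce_labels=False,  # set True if labels were shifted down by 1 in preprocessing
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
# results["per_category_iou"] and results["per_category_accuracy"] hold one
# entry per label; categories absent from the references come back as nan.
```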

Model description

SegFormer couples a hierarchical Transformer encoder with a lightweight all-MLP decode head for semantic segmentation. This checkpoint fine-tunes the smallest encoder, MiT-b0 (nvidia/mit-b0), with a freshly initialized decode head for 150-category scene parsing.

Intended uses & limitations

Intended for semantic-segmentation experimentation. Note the limitation implied by the results above: after 5 epochs the mean IoU is only 0.0161 and most categories are never predicted, so this checkpoint is a training artifact or a starting point for further fine-tuning rather than a usable segmenter.
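
A minimal inference sketch, assuming the checkpoint id below points at this repository (the image path is a placeholder):

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "segformer-b0-scene-parse-150-MASKED5"  # assumed: local path or Hub id of this model
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("scene.jpg").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, height/4, width/4)

# Upsample the low-resolution logits to the input size, then take the
# per-pixel argmax to obtain a label map.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
label_map = upsampled.argmax(dim=1)[0]  # (height, width) tensor of category ids
```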

Training and evaluation data

Presumably the scene_parse_150 scene-parsing dataset, per the model name; the exact splits and preprocessing steps were not recorded in this card.

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
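
As a sketch, these values map onto a Trainer setup roughly as follows; the dataset id (scene_parse_150), the per-epoch evaluation strategy, and the preprocessing details are assumptions, and the compute_metrics wiring is omitted:

```python
from datasets import load_dataset
from transformers import (
    SegformerImageProcessor,
    SegformerForSemanticSegmentation,
    Trainer,
    TrainingArguments,
)

# Assumption: scene_parse_150 is the training corpus, per the model name.
ds = load_dataset("scene_parse_150", split="train[:200]").train_test_split(test_size=0.1)
processor = SegformerImageProcessor(do_reduce_labels=True)

def transform(batch):
    # Turn raw images and segmentation masks into pixel_values/labels tensors.
    images = [img.convert("RGB") for img in batch["image"]]
    return processor(images, batch["annotation"], return_tensors="pt")

ds["train"].set_transform(transform)
ds["test"].set_transform(transform)

model = SegformerForSemanticSegmentation.from_pretrained("nvidia/mit-b0", num_labels=150)

# Mirrors the hyperparameters above; the Trainer default optimizer is AdamW
# with betas=(0.9, 0.999) and epsilon=1e-08, and the default schedule is linear.
args = TrainingArguments(
    output_dir="segformer-b0-scene-parse-150-MASKED5",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="epoch",   # assumption: matches the per-epoch rows below
    remove_unused_columns=False,   # keep the raw columns the transform needs
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=ds["train"],
    eval_dataset=ds["test"],
)
trainer.train()
```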

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
| 4.8994 | 1.0 | 20 | 4.9538 | 0.0033 | 0.0158 | 0.0601 | [0.09446474142430365, 0.09207604302632935, 0.0, 0.0027654472166947293, 0.00022686472943915094, 0.024674639716049208, 0.05004409798287291, 0.0, 0.0659773981315997, 0.000611224966742171, 0.0033934631184140023, 0.0, 0.00017743332055813424, 0.0, 0.0, 0.0035134839707053516, nan, 0.0016311966356569389, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.014299143136049748, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0] | [0.11622029935531653, 0.15137832797879686, 0.0, 0.0028017285941732317, 0.00027341242481158316, 0.024855809501017113, 0.069186595342983, 0.0, 0.3015857443544647, 0.0018049902672093434, 0.0074461136512083605, 0.0, 0.0002077602612793046, nan, 0.0, 0.006078616776982304, nan, 0.003389291955727374, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.057269041413263826, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.8245 | 2.0 | 40 | 4.7740 | 0.0083 | 0.0441 | 0.1807 | [0.2585690813978081, 0.14108215993332457, 0.03419684396156098, 0.023996082272282077, 0.027117104130902927, 0.062083130688529144, 0.14413986579778207, 0.003936397464167586, 0.08274324585033332, 0.024486949334228778, 5.267038870746866e-05, 0.0, 0.0028745859227420813, 0.0, 8.715812226541392e-05, 0.004454446326241793, 0.0, 0.011132801332995728, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0] | [0.48883868337236625, 0.37235736052362817, 0.03638140423119386, 0.02735562310030395, 0.07379757970566558, 0.06843125523513223, 0.6446664568911264, 0.00429410382898755, 0.20655702614962007, 0.06710316758095912, 8.708904855214457e-05, 0.0, 0.003490372389492317, nan, 8.800647727672756e-05, 0.0051870863163582335, nan, 0.07218132711963142, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.5392 | 3.0 | 60 | 4.5845 | 0.0136 | 0.0563 | 0.2071 | [0.2557528891910265, 0.13708037604006187, 0.08692430869243087, 0.09462758833831256, 0.03757492466907787, 0.07904498578899012, 0.1908769498813005, 0.0028976120267943887, 0.1151440589593073, 0.045923559494518804, 0.0, 0.0, 0.00032954989615346296, nan, 0.0, 0.010276481299315599, nan, 0.0010769035854554668, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0] | [0.47425697914676446, 0.38188388718673894, 0.09495941873302897, 0.18992825093563262, 0.09620551104348446, 0.08812253200909417, 0.7368628067967276, 0.0031947380784590087, 0.43822910349256183, 0.12617235887453548, 0.0, 0.0, 0.0003573476494004039, nan, 0.0, 0.013264892611103607, nan, 0.0018005613514801672, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.5062 | 4.0 | 80 | 4.4950 | 0.0154 | 0.0587 | 0.2151 | [0.26153152576800204, 0.13111120499063553, 0.0587806904353667, 0.1163810590985491, 0.05045636959256171, 0.07593274224946232, 0.20581123895280973, 0.011514468896205164, 0.12641034098405404, 0.060098757566103854, 0.0, 0.0, 0.0001400405380504883, nan, 3.468970062788358e-05, 0.009957528094454266, nan, 0.00014301548142586434, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0] | [0.5153390546612014, 0.38497256575329436, 0.062016978485960675, 0.2491160597977793, 0.1301799767005064, 0.08550915400263252, 0.7767463813719321, 0.012487667371388301, 0.39649851949627196, 0.1335338878074677, 0.0, 0.0, 0.00015789779857227148, nan, 3.5202590910691024e-05, 0.013237876536539241, nan, 0.00021183074723296087, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 4.2543 | 5.0 | 100 | 4.4715 | 0.0161 | 0.0620 | 0.2198 | [0.26811535900844996, 0.13361073012813385, 0.08961412470909454, 0.1128982647416514, 0.0583985070987828, 0.07100614244377887, 0.19794722756159697, 0.012499372941156798, 0.13135492375006866, 0.07853702343048412, 0.0, 0.0, 0.000257651881995438, nan, 0.00015613885949237522, 0.00816753033501802, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0] | [0.5251087767953342, 0.3576830362495917, 0.09867066511502999, 0.24665550110621756, 0.14805877178383775, 0.08033983486897212, 0.8465623033354311, 0.014047451256753583, 0.38394991259676786, 0.20428242788886922, 0.0, 0.0, 0.0002825539553398542, nan, 0.00015841165909810963, 0.0103471565581521, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |

Framework versions

  • Transformers 4.28.1
  • Pytorch 2.0.0+cu118
  • Datasets 2.12.0
  • Tokenizers 0.13.3