
Hemg/segformer-b0-finetuned-ade-512-512

This model is a fine-tuned version of nvidia/segformer-b0-finetuned-ade-512-512 on the scene_parse_150 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2697
  • Mean Iou: 0.4348
  • Mean Accuracy: 0.6256
  • Overall Accuracy: 0.9027
  • Per Category Iou: [0.8721567311144488, 0.9302552083535434, 0.9691448438428073, 0.8377177146634799, 0.9083280802736413, 0.8822213720238972, 0.0, 0.8569897889829763, 0.8560781290156773, 0.939549448793737, 0.5462190227431993, 0.912755679212401, 0.565392030848329, 0.5326531383441031, 0.8369071057812779, 0.6852501836584769, 0.33327694129588903, 0.583029197080292, 0.9240404935578431, 0.0, 0.21189945911549474, nan, 0.9584450402144772, 0.0, 0.8232311974167744, nan, nan, 0.6102189781021898, 0.0, 0.7157043030525929, nan, 0.0, 0.7543182490387459, nan, nan, 0.837372163415901, 0.13758647194465795, 0.6856352684744651, nan, 0.1457142857142857, nan, 0.0, nan, nan, nan, nan, nan, 0.6717817561807332, nan, nan, nan, 0.16240573845870884, 0.0, 0.4673176023867643, nan, nan, nan, 0.7908935546875, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.7340108638514106, nan, 0.0, nan, 0.3986013986013986, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.013386880856760375, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.6075484301937207, 0.21631205673758866, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]
  • Per Category Accuracy: [0.9114318888713748, 0.9720078981191286, 0.9888312471245221, 0.8799463698132558, 0.9243177874756822, 0.8891030964527228, nan, 0.8899813161718912, 0.9926098098813994, 0.9802112225304033, 0.7967290805185151, 0.9306243960223199, 0.8302269617326475, 0.9757711690757047, 0.9492082825822168, 0.8963271407217596, 0.3925866879234755, 0.8116003386960203, 0.9764087233697708, 0.0, 0.22237061769616026, nan, 0.9919730452878803, nan, 0.9868529546495648, nan, nan, 0.8733459357277883, nan, 0.7227955348538387, nan, nan, 0.8357282126062949, nan, nan, 0.8559371681739958, 0.17063870352716873, 0.9793773016404419, nan, 0.18848996832101372, nan, 0.0, nan, nan, nan, nan, nan, 0.7887887887887888, nan, nan, nan, 0.16242067506667893, nan, 0.4818232662192394, nan, nan, nan, 0.8469281045751634, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.8599876822007801, nan, nan, nan, 0.4050532964863798, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.013386880856760375, nan, nan, nan, nan, nan, nan, nan, nan, 0.9163727959697733, 0.23461538461538461, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]
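The checkpoint can be used for semantic segmentation via the transformers Segformer classes. The sketch below stays self-contained by building a randomly initialized SegFormer-B0-style model from `SegformerConfig` rather than downloading weights; for real predictions, load this repository with `from_pretrained` as shown in the comment. Upsampling the 1/4-resolution logits back to the input size is the standard SegFormer post-processing step.

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# For real use, load the fine-tuned checkpoint instead:
#   model = SegformerForSemanticSegmentation.from_pretrained(
#       "Hemg/segformer-b0-finetuned-ade-512-512")
# Here a randomly initialized config keeps the sketch runnable offline.
config = SegformerConfig(num_labels=150)  # scene_parse_150 has 150 classes
model = SegformerForSemanticSegmentation(config)
model.eval()

pixel_values = torch.randn(1, 3, 512, 512)  # one RGB image at 512x512
with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits  # (1, 150, 128, 128)

# SegFormer predicts at 1/4 of the input resolution; upsample back,
# then take the argmax to get a class index per pixel.
upsampled = torch.nn.functional.interpolate(
    logits, size=(512, 512), mode="bilinear", align_corners=False)
pred = upsampled.argmax(dim=1)  # (1, 512, 512)
```

In practice you would also run real images through `SegformerImageProcessor` to resize and normalize them before the forward pass.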

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 4
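With a linear scheduler and the step counts in the training results (20 optimizer steps per epoch, 80 in total), the learning rate decays from 6e-05 toward zero over training. A minimal sketch of that schedule, assuming no warmup (the card does not list a warmup_steps value):

```python
def linear_lr(step, base_lr=6e-5, total_steps=80):
    """Linearly decay base_lr to zero over total_steps (no warmup assumed)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))   # 6e-05 at the start of training
print(linear_lr(40))  # 3e-05 halfway through (end of epoch 2)
print(linear_lr(80))  # 0.0 at the final step
```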

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy |
|--------------:|------:|-----:|----------------:|---------:|--------------:|-----------------:|
| 0.5733 | 1.0 | 20 | 0.2368 | 0.4427 | 0.6352 | 0.9236 |
| 0.7425 | 2.0 | 40 | 0.2664 | 0.4357 | 0.6322 | 0.9019 |
| 0.3789 | 3.0 | 60 | 0.2691 | 0.4208 | 0.6308 | 0.9015 |
| 0.1984 | 4.0 | 80 | 0.2697 | 0.4348 | 0.6256 | 0.9027 |

Per-category IoU and accuracy columns are omitted for readability; the final-epoch (4.0) values are the per-category lists given in the evaluation summary above.
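The nan entries in the per-category lists mark classes that do not occur in the evaluation split; the summary metrics average only the remaining valid categories, which matches how the evaluate library's mean_iou metric aggregates (a nan-ignoring mean over per-category IoU). A toy sketch with made-up numbers:

```python
import math

def nanmean(values):
    """Mean over entries that are present, skipping nan placeholders."""
    valid = [v for v in values if not math.isnan(v)]
    return sum(valid) / len(valid)

# Made-up per-category IoUs (not the card's real numbers):
per_category_iou = [0.9, 0.5, float("nan"), 0.0]
print(nanmean(per_category_iou))  # averages 0.9, 0.5, 0.0
```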

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.1.2+cpu
  • Datasets 2.18.0
  • Tokenizers 0.15.2
Model size

3.75M parameters, F32 tensors (Safetensors format)
