
segformer-b0-human-parser

This model is a fine-tuned version of nvidia/mit-b0 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2073
  • Mean Iou: 0.5123
  • Mean Accuracy: 0.6061
  • Overall Accuracy: 0.9404
  • Per Category Iou: [0.9738435807240893, 0.25487077790797996, 0.6992917234103969, 0.0, 0.6874674997812054, 0.640439429039686, 0.739829923873258, 0.5614734173142479, 0.0, 0.36041378832602766, 0.34524546132802786, 0.7459134523284406, 0.6752988298594533, 0.6595964688647477, 0.6534596510166254, 0.6718737447469826, 0.5531669206163902, 0.0]
  • Per Category Accuracy: [0.9881203940329015, 0.2634819419853832, 0.8494709222844186, 0.0, 0.8557345821246315, 0.7440582073040913, 0.8536968246833937, 0.7504572727657617, 0.0, 0.47917172993729645, 0.44030265431487925, 0.8766212774407773, 0.8252944601721314, 0.7851447983014862, 0.7807782580582752, 0.7778494044297343, 0.638974454957921, 0.0]
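As a quick sanity check, the headline Mean Iou above is the unweighted average of the 18 per-category IoU values, with the zero-IoU classes included. A minimal sketch:

```python
# Per-category IoU values copied from the evaluation results above.
per_category_iou = [
    0.9738435807240893, 0.25487077790797996, 0.6992917234103969, 0.0,
    0.6874674997812054, 0.640439429039686, 0.739829923873258,
    0.5614734173142479, 0.0, 0.36041378832602766, 0.34524546132802786,
    0.7459134523284406, 0.6752988298594533, 0.6595964688647477,
    0.6534596510166254, 0.6718737447469826, 0.5531669206163902, 0.0,
]

# Mean IoU is the plain mean over all 18 categories, including the
# classes that were never predicted (IoU 0.0), which drags it well
# below the 0.9404 overall pixel accuracy.
mean_iou = sum(per_category_iou) / len(per_category_iou)
print(round(mean_iou, 4))  # 0.5123
```

The same relationship holds between Per Category Accuracy and Mean Accuracy.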

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
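The effective batch size and the learning-rate schedule implied by these hyperparameters can be sketched as follows. The 1500 total optimizer steps come from the results table below (10 epochs × 150 steps); the absence of warmup is an assumption, since the card does not list a `warmup_ratio` or `warmup_steps`:

```python
train_batch_size = 2
gradient_accumulation_steps = 4

# Effective (total) train batch size, as reported above.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 8

learning_rate = 6e-05
total_steps = 1500  # 10 epochs x 150 optimizer steps per epoch

def linear_lr(step: int) -> float:
    """Linear decay from the initial rate to 0, assuming no warmup."""
    return learning_rate * max(0.0, (total_steps - step) / total_steps)

print(linear_lr(0))    # 6e-05 at the start of training
print(linear_lr(750))  # 3e-05 halfway through
```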

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|---|---|---|---|---|---|---|---|---|
| 1.196 | 1.0 | 150 | 1.0385 | 0.2589 | 0.3904 | 0.8476 | [0.9204842413580886, 0.0, 0.5011135987109498, 0.0, 0.4247896643602946, 0.21716367546832283, 0.37654541592527674, 0.2413904123626465, 0.0, 0.00011498100258323986, 0.17434826190421535, 0.6403960678587253, 0.2886871534533113, 0.3498946119943481, 0.20843312477174916, 0.2948213980516151, 0.02278691191401758, 0.0] | [0.9286047065013684, 0.0, 0.8187642509847288, 0.0, 0.8186861291885359, 0.2964611313525812, 0.8461503960175404, 0.32777406991076224, 0.0, 0.00011512278484574826, 0.28907191592259074, 0.7876296848770058, 0.5291312004696214, 0.6774522292993631, 0.35063191676043165, 0.33470760290184154, 0.02290101296194402, 0.0] |
| 0.5993 | 2.0 | 300 | 0.5363 | 0.3513 | 0.4542 | 0.9036 | [0.9606464787878808, 0.0, 0.6184389403979624, 0.0, 0.5766527077088294, 0.3976546071060847, 0.541258795941219, 0.2463616617645146, 0.0, 0.008046569553189534, 0.04380007870156896, 0.6770073675856182, 0.48097781602690326, 0.4815914821624524, 0.4298610883927723, 0.5332039845335002, 0.32718757935640846, 0.0] | [0.9826711239263715, 0.0, 0.8494886765701763, 0.0, 0.7615903030599638, 0.6444578056184961, 0.8692773860198443, 0.3153733557803321, 0.0, 0.008066269791525428, 0.04603502041292726, 0.835153289660539, 0.7302411070187564, 0.6723815994338287, 0.4682921530991669, 0.61784401639502, 0.37480719446524086, 0.0] |
| 0.468 | 3.0 | 450 | 0.3407 | 0.4236 | 0.5160 | 0.9222 | [0.967628327243794, 0.0, 0.6552366972477064, 0.0, 0.6290085111004403, 0.4670381355483089, 0.6672147151783046, 0.383883893570567, 0.0, 0.2165883496207548, 0.16886930983847284, 0.6990637778386917, 0.5844660521614327, 0.5797863638252885, 0.5635731087436499, 0.5986930492860812, 0.44376046579658623, 0.0] | [0.9867078128564305, 0.0, 0.8208199436428244, 0.0, 0.8762466521242056, 0.6131590299246492, 0.8315169152556456, 0.5000853892967014, 0.0, 0.26342651456814886, 0.1946269217204203, 0.856827664875574, 0.7383624321842127, 0.7423898089171974, 0.6493545064057199, 0.6988746063206779, 0.5148313684457393, 0.0] |
| 0.2826 | 4.0 | 600 | 0.2798 | 0.4551 | 0.5533 | 0.9293 | [0.9704567835904337, 0.0012846283896218, 0.6628886267200698, 0.0, 0.6545595921801918, 0.5437484428274499, 0.6924069605979732, 0.4294368657847134, 0.0, 0.3000079178653435, 0.2319497953477191, 0.712312533356426, 0.6370556857129019, 0.6220110253876812, 0.6080063117165435, 0.6351838258482077, 0.4900002446962097, 0.0] | [0.9873710355453347, 0.0012846283896218, 0.8479478581990559, 0.0, 0.8435677609119634, 0.7389797371417705, 0.8298649536752014, 0.5549519370050572, 0.0, 0.40712021632850415, 0.2790950818673832, 0.8737095012573054, 0.7939287709364969, 0.7406539278131635, 0.7376293829838148, 0.7590265617282763, 0.5644590620856372, 0.0] |
| 0.285 | 5.0 | 750 | 0.2480 | 0.4721 | 0.5761 | 0.9325 | [0.9719933486526419, 0.02278384962526125, 0.6784888470926147, 0.0, 0.6575070855310532, 0.5547105384213228, 0.7129180414303309, 0.49382598215708845, 0.0, 0.3345492643374482, 0.2584484563669665, 0.7208913273095512, 0.6495685343255954, 0.6406974011147674, 0.6289914582523433, 0.6479963374487807, 0.5249157713980156, 0.0] | [0.9863067860954057, 0.02278384962526125, 0.8435847424740851, 0.0, 0.8043595854480959, 0.6760888068407149, 0.8171256340624186, 0.7457634778323201, 0.0, 0.46972398672762206, 0.30755138754621136, 0.8880640267486244, 0.8222930494143935, 0.7778343949044586, 0.790811538272246, 0.773970340151763, 0.6441756943254286, 0.0] |
| 0.2314 | 6.0 | 900 | 0.2295 | 0.4855 | 0.5841 | 0.9359 | [0.9730456757750755, 0.0637654378431555, 0.6851240240801579, 0.0, 0.6712804182366192, 0.595825314509297, 0.7089200924431801, 0.5193061778733198, 0.0, 0.3552495306547412, 0.3065643676435385, 0.731074083138405, 0.6581214590064147, 0.6410021270484412, 0.638811317249175, 0.6591728612526099, 0.5321564146622513, 0.0] | [0.9862461039828672, 0.0638520214592846, 0.8482769465672089, 0.0, 0.8695216069897929, 0.7054527891869578, 0.8734292437330271, 0.6676700900084057, 0.0, 0.49668062637028093, 0.38343317779418595, 0.8656622847629362, 0.8317493869358152, 0.7606239207360227, 0.7644012022286373, 0.7771181711470742, 0.6191245050197091, 0.0] |
| 0.2178 | 7.0 | 1050 | 0.2208 | 0.4958 | 0.5886 | 0.9371 | [0.9734572256785639, 0.1587942678306704, 0.6914719081623183, 0.0, 0.6601870822105664, 0.6270601644744053, 0.7162747021535327, 0.5124449676994741, 0.0, 0.3619236654764809, 0.3177490572194479, 0.7361379703589878, 0.6675998590236725, 0.6535390393232998, 0.6428418797248676, 0.6647232417213139, 0.540526721767411, 0.0] | [0.9877832474957517, 0.160418802167228, 0.8362072026600993, 0.0, 0.8951839293491445, 0.7298165744011244, 0.8545759194141131, 0.6104045549838093, 0.0, 0.5009990099440503, 0.3903854985752817, 0.87941298037957, 0.8317257165039719, 0.7700597310686482, 0.7556186878329506, 0.7708032661753292, 0.6210221173881277, 0.0] |
| 0.171 | 8.0 | 1200 | 0.2149 | 0.5031 | 0.6016 | 0.9382 | [0.9738028781291189, 0.2090036563071298, 0.6954619055669778, 0.0, 0.6788968068357729, 0.6110085872408796, 0.7350502197799524, 0.5429843550868226, 0.0, 0.3594320656408074, 0.3347269860560705, 0.7407122718676398, 0.6661915078754087, 0.646810692733125, 0.6485688609619236, 0.6692661668757929, 0.5431693882334557, 0.0] | [0.9868011813068851, 0.21306859782478468, 0.8581432567954529, 0.0, 0.8242681378222496, 0.7152717059809872, 0.8559814323106499, 0.7949015692126631, 0.0, 0.4839096721047259, 0.42070167693022054, 0.8844953113707054, 0.8490311692246514, 0.7584169851380043, 0.777752321214312, 0.7791996202718392, 0.6270825252789479, 0.0] |
| 0.1615 | 9.0 | 1350 | 0.2072 | 0.5123 | 0.6111 | 0.9404 | [0.9738930615479604, 0.25244414082891337, 0.6979447149962021, 0.0, 0.6899370517994793, 0.6465310914683069, 0.7354452381057439, 0.5556946534030943, 0.0, 0.3601541486614925, 0.34689383402874363, 0.7437578951763325, 0.6725374477193258, 0.6572256369218894, 0.6525889931816169, 0.6736342658892065, 0.5632821294256818, 0.0] | [0.9872016504484974, 0.26072631424805975, 0.8553133504619919, 0.0, 0.8557708167111089, 0.7713662180559067, 0.8656042619570102, 0.7063312238926388, 0.0, 0.47889287696955896, 0.44806185028151824, 0.8843821654286037, 0.8333672609523088, 0.7867685774946921, 0.8004561816173064, 0.7995715229536693, 0.6652027727916437, 0.0] |
| 0.179 | 10.0 | 1500 | 0.2073 | 0.5123 | 0.6061 | 0.9404 | [0.9738435807240893, 0.25487077790797996, 0.6992917234103969, 0.0, 0.6874674997812054, 0.640439429039686, 0.739829923873258, 0.5614734173142479, 0.0, 0.36041378832602766, 0.34524546132802786, 0.7459134523284406, 0.6752988298594533, 0.6595964688647477, 0.6534596510166254, 0.6718737447469826, 0.5531669206163902, 0.0] | [0.9881203940329015, 0.2634819419853832, 0.8494709222844186, 0.0, 0.8557345821246315, 0.7440582073040913, 0.8536968246833937, 0.7504572727657617, 0.0, 0.47917172993729645, 0.44030265431487925, 0.8766212774407773, 0.8252944601721314, 0.7851447983014862, 0.7807782580582752, 0.7778494044297343, 0.638974454957921, 0.0] |

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2
Model size

  • 3.72M params (F32 tensors, Safetensors format)