
segformer-human-parser

This model is a fine-tuned version of nvidia/mit-b0 on an unknown dataset. It achieves the following results on the evaluation set (an inference sketch follows the metrics list):

  • Loss: 0.1695
  • Mean Iou: 0.6248
  • Mean Accuracy: 0.7318
  • Overall Accuracy: 0.9504
  • Per Category Iou: [0.9788481644896356, 0.5892312801423537, 0.7530359111050211, 0.40150078831301017, 0.759007420581381, 0.5918265856291685, 0.7295908937991514, 0.5957128350073513, 0.29234848352263776, 0.50965626208837, 0.5006675514608004, 0.7987689193921012, 0.7493932173414964, 0.7541494190147875, 0.723483844844196, 0.7233847443350387, 0.6382073069749079, 0.15805024780469318]
  • Per Category Accuracy: [0.9894757584408714, 0.7237625570776256, 0.8611644158457461, 0.437957157784744, 0.8837220958158657, 0.6895901282621771, 0.8687869923662133, 0.7556548829789819, 0.38556260819388344, 0.6658026398491514, 0.6585143772877097, 0.9066994160521716, 0.8668649943781122, 0.8502555890338424, 0.8532504098367223, 0.8396507214826516, 0.7513961299212337, 0.18413990548791004]

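The fine-tuned checkpoint can be used for semantic segmentation with the standard transformers SegFormer classes. The sketch below is a minimal, hedged example of running inference; the repository id and the image path are placeholders rather than values taken from this card, and the class labels are not documented here.

```python
# Minimal inference sketch (not part of the original card).
# The repo id and image path below are placeholders.
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo_id = "<your-namespace>/segformer-human-parser"  # placeholder repo id
processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)

image = Image.open("example.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 of the input resolution; upsample the logits back
# to the image size and take the per-pixel argmax to get the segmentation map.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```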
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 6e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
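
As a hedged illustration only (the actual training script is not included in this card), these values map onto a transformers Trainer configuration roughly as follows. The dataset is not documented, so `train_ds` and `eval_ds` are placeholders; `num_labels=18` is inferred from the 18 per-category scores reported above; per-epoch evaluation is assumed from the results table below.

```python
# Hedged training-setup sketch; assumptions are flagged in comments.
from transformers import SegformerForSemanticSegmentation, Trainer, TrainingArguments

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=18,  # inferred from the 18 per-category metrics; label names are not documented
)

args = TrainingArguments(
    output_dir="segformer-human-parser",
    learning_rate=6e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",  # assumption: matches the per-epoch results table below
    # the listed Adam betas=(0.9, 0.999) and epsilon=1e-08 are the TrainingArguments defaults
)

train_ds = ...  # placeholder: replace with the (undocumented) training dataset
eval_ds = ...   # placeholder: replace with the (undocumented) evaluation dataset

trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
trainer.train()
```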

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
| No log | 1.0 | 300 | 0.5492 | 0.2735 | 0.3609 | 0.8877 | [0.9549821998088528, 0.0, 0.6118139617850337, 0.0, 0.5216242508548116, 0.2777339456209645, 0.5122677803943367, 0.18436029966897033, 0.0, 0.010782200543460695, 0.0, 0.6655903673410478, 0.41018015604066516, 0.3081475335034851, 0.12417407878017789, 0.27884415987406247, 0.06250047083056665, 0.0] | [0.9832989935765462, 0.0, 0.8147291863838911, 0.0, 0.887714080264451, 0.4180678278558261, 0.7615240773456551, 0.21988538003594602, 0.0, 0.010903834066624764, 0.0, 0.8059236605430289, 0.7257109903446306, 0.38293121873469227, 0.12528263910811838, 0.2957484860744537, 0.06412150360266661, 0.0] |
| 0.9562 | 2.0 | 600 | 0.2966 | 0.4208 | 0.5251 | 0.9210 | [0.9688580375120208, 0.0012359208523592085, 0.6643013921043928, 0.0, 0.6609494007786153, 0.36114643842189653, 0.6204816778313804, 0.3903737425899652, 0.0, 0.329012741065434, 0.059303439224489586, 0.7059378067944898, 0.5766512154345281, 0.5851999343449144, 0.5939953707591535, 0.592555256128876, 0.46418725792175913, 0.0] | [0.98745563536642, 0.0012359208523592085, 0.824324038618192, 0.0, 0.8425629808037797, 0.45267673334865505, 0.7586258718694472, 0.6064525033250334, 0.0, 0.5359371464487743, 0.06089112388167257, 0.8673788414816078, 0.7444751566097338, 0.7299181193277617, 0.7254326321372885, 0.7190841312347083, 0.5958355691285808, 0.0] |
| 0.9562 | 3.0 | 900 | 0.2513 | 0.4607 | 0.5580 | 0.9263 | [0.9725091160684688, 0.14855734979814772, 0.677348095665415, 0.0, 0.646576412225751, 0.3974497016759501, 0.5984044043047161, 0.3987109042877217, 0.0, 0.373744443990598, 0.3376357853041791, 0.7243261927420948, 0.6405549659551344, 0.6309876029203761, 0.608383061921565, 0.618438110945233, 0.5196700886261589, 0.0] | [0.9852854166837305, 0.15234703196347033, 0.8082490855975236, 0.0, 0.9235572625372296, 0.45350434956944397, 0.894635411892739, 0.47534978111381, 0.0, 0.5060993086109365, 0.4390685525935831, 0.8596916158731512, 0.7836655155181953, 0.7033966705361421, 0.7075760902137561, 0.717594287218808, 0.6345816247441256, 0.0] |
| 0.2758 | 4.0 | 1200 | 0.2141 | 0.5050 | 0.6025 | 0.9358 | [0.9738308066334914, 0.39074635419023057, 0.7079980374728607, 0.0, 0.6900770649513168, 0.5073099614956236, 0.6570693425781712, 0.5139368408162509, 0.0, 0.38295206484223304, 0.380551019433701, 0.7462835279947642, 0.6562641635398792, 0.6512530926527663, 0.6473580442703908, 0.6381828941414001, 0.5465645036196742, 0.0] | [0.9887685525392029, 0.4732420091324201, 0.8336129500035444, 0.0, 0.8687858535953492, 0.6127752404610202, 0.8742595710804784, 0.6595062357623179, 0.0, 0.49486360779384037, 0.5183446022409474, 0.8675685179659861, 0.7478070175438597, 0.7293468089690367, 0.7692559159910838, 0.7358143731417086, 0.6707010349799827, 0.0] |
| 0.1986 | 5.0 | 1500 | 0.1979 | 0.5193 | 0.6180 | 0.9397 | [0.9735201277974378, 0.4271586708192732, 0.7088750538777107, 0.0, 0.7158768245426267, 0.5574907653230701, 0.6885542934480133, 0.5408903936431009, 0.0, 0.40501379416047206, 0.3769302096342365, 0.755407287525701, 0.6670435298728543, 0.6705490185782534, 0.6558029975704469, 0.6593847752496036, 0.5452203944873667, 0.0] | [0.990629057174256, 0.5001826484018265, 0.8611028545773601, 0.0, 0.8504929390187358, 0.6821886475993018, 0.8266017947564303, 0.68236251316186, 0.0, 0.5314921433060968, 0.4695401297116738, 0.8883199778355568, 0.8562057164783781, 0.7939059513818101, 0.7807633940089322, 0.7905452700663366, 0.6191165462097474, 0.0] |
| 0.1986 | 6.0 | 1800 | 0.1880 | 0.5385 | 0.6506 | 0.9427 | [0.9760795396969519, 0.49546113392674634, 0.7220136128809921, 0.0, 0.7309186937055082, 0.5440699209336682, 0.6956642876121553, 0.5778957468660312, 0.0, 0.45966320586196857, 0.42269142660206377, 0.7627396838617393, 0.683665616622042, 0.6876194866777059, 0.6761946354847002, 0.6771030717469252, 0.5804764385040639, 0.0] | [0.9879277145076928, 0.6493089802130898, 0.8507195206429236, 0.0, 0.8673710483519189, 0.6383928353129293, 0.827153147058454, 0.7699004480072285, 0.0, 0.6786524198617222, 0.5424190472721786, 0.8977739653041217, 0.8809442540736379, 0.7833886957724105, 0.8262324347105663, 0.8358410233687171, 0.6755292486182233, 0.0] |
| 0.1604 | 7.0 | 2100 | 0.1801 | 0.5512 | 0.6583 | 0.9451 | [0.9768452011492089, 0.5139995533826512, 0.7321181285491519, 0.0, 0.7394251546579675, 0.5626216812369556, 0.6996068780072735, 0.5955600641301114, 0.013181504485852312, 0.4662455998760579, 0.45393305665267836, 0.7712977594476239, 0.7091649640409238, 0.7115348901036068, 0.6884791107150497, 0.6904370923226971, 0.596390281908532, 0.0] | [0.9877702999704974, 0.6586423135464231, 0.8554081263361594, 0.0, 0.8912112719378347, 0.6376617946924984, 0.8599751263341213, 0.7450261479700625, 0.013225620311598385, 0.6506901319924576, 0.6123374376611315, 0.8868025659605302, 0.8339579875426103, 0.8336640073402126, 0.8096899342819004, 0.8012390750123618, 0.7724611854904065, 0.0] |
| 0.1604 | 8.0 | 2400 | 0.1778 | 0.5539 | 0.6585 | 0.9448 | [0.9770667220917759, 0.5170361657163364, 0.7280902499823723, 8.707767328456984e-05, 0.7367522095068177, 0.5689756877304252, 0.7008091784312203, 0.5725296206849773, 0.05353968775119725, 0.4611822190210144, 0.4653302901758269, 0.775234121421819, 0.7100158455815765, 0.706912520826242, 0.6926195102723859, 0.6954450806871421, 0.6085951266279969, 0.0] | [0.9876449078033697, 0.6315616438356164, 0.8732515667031888, 8.707767328456984e-05, 0.8794785955208667, 0.6744662669985282, 0.8310730990791718, 0.761356934135649, 0.05457587997691864, 0.5956178504085481, 0.63011360827907, 0.8869666680874643, 0.870182354410951, 0.8459069846710893, 0.8025300106245062, 0.8150169212887151, 0.7134814589088436, 0.0] |
| 0.1409 | 9.0 | 2700 | 0.1787 | 0.5645 | 0.6692 | 0.9457 | [0.9774355149875643, 0.5510517221518259, 0.7312727926222036, 0.022312196608546116, 0.7350711309053871, 0.5614035464757676, 0.7147139035430374, 0.5703592039897254, 0.120412942591956, 0.48404448269485184, 0.46641653400561417, 0.7787625198600512, 0.7244436472900285, 0.7267460530216988, 0.700144511645333, 0.695542108555244, 0.6005887704113232, 0.0] | [0.9884721112458781, 0.7281095890410959, 0.8299478781260998, 0.022313653779171022, 0.9151881242410469, 0.6386825621140032, 0.856831255022075, 0.7009491012717575, 0.1293594922100404, 0.6843871778755499, 0.6151054525210008, 0.8915828396061549, 0.8380003926397886, 0.822356778495578, 0.8175389284792854, 0.7944223964654735, 0.7719378437880223, 0.0] |
| 0.125 | 10.0 | 3000 | 0.1732 | 0.5754 | 0.6771 | 0.9472 | [0.9773518514328442, 0.5454509350317965, 0.7400147481796903, 0.0804068150208623, 0.7504513423907763, 0.5667231094077799, 0.707444281418412, 0.5841859784154606, 0.16627132750728257, 0.4526267217406859, 0.4755204054945095, 0.7842849242463608, 0.7288534900652005, 0.7226213774065801, 0.70712983974001, 0.7035594729392917, 0.6299971188122727, 0.03478374063949465] | [0.9896322026032164, 0.6689436834094369, 0.8524227157349359, 0.0805468477882271, 0.8944258229648844, 0.662617418815372, 0.8473261274456662, 0.7218953905447576, 0.18444316214656664, 0.5598164676304211, 0.6716598463543632, 0.900672392481139, 0.856513581767236, 0.8560844553204215, 0.8319517201924911, 0.834762170805479, 0.7387464972541122, 0.035024662826394395] |
| 0.125 | 11.0 | 3300 | 0.1739 | 0.5890 | 0.6979 | 0.9470 | [0.9777330455634119, 0.5675078907706614, 0.7372831569029541, 0.16542927439892952, 0.7427074791988457, 0.5782845962135678, 0.7214436362376275, 0.5722878749931231, 0.2058636202800384, 0.49905860495575977, 0.4704070648273662, 0.7872068799542741, 0.7316628962354176, 0.734756476482697, 0.7077843148100431, 0.7094031848591091, 0.6194635152180895, 0.0729665150943521] | [0.988806578075719, 0.754234398782344, 0.8685698011508848, 0.16686259143155696, 0.8557120644759385, 0.6962348963095896, 0.8524367212730329, 0.7689250359382745, 0.23500288517022505, 0.7050584538026399, 0.6138931452609452, 0.8923042496057286, 0.8697964073459336, 0.811100999739534, 0.8407349658429336, 0.8312735597639368, 0.7347066084587035, 0.07605118829981719] |
| 0.1148 | 12.0 | 3600 | 0.1700 | 0.5980 | 0.7001 | 0.9481 | [0.9778018061430781, 0.5708823248255425, 0.7425154552038586, 0.2236867444099711, 0.7444078345337019, 0.5950717101614158, 0.7234283678662421, 0.5765847328098312, 0.23484204042122556, 0.4801390680776628, 0.4897876605538644, 0.7882241215574549, 0.740445370114888, 0.7458225407883768, 0.7150568150507534, 0.7118838076150624, 0.6184038808400304, 0.08554807503758416] | [0.9897158357577079, 0.7271841704718417, 0.843436636043563, 0.22757749912922326, 0.8871126392777582, 0.7430056038297727, 0.8390423499053861, 0.7170090257923349, 0.27718407386035776, 0.6100439974858579, 0.6864832943539261, 0.9074421380162824, 0.8351492923560172, 0.8592400607539796, 0.8317466015201217, 0.8177365287918777, 0.712046133564752, 0.09029009002793971] |
| 0.1148 | 13.0 | 3900 | 0.1710 | 0.6074 | 0.7144 | 0.9484 | [0.9783396931951305, 0.566654824138456, 0.7452299560328497, 0.2804236159553554, 0.7441416393473166, 0.5890870074936531, 0.7258735850051261, 0.5828579852893129, 0.2683477880620946, 0.48800175739135293, 0.49247722180877007, 0.7880866756343737, 0.7400328630323235, 0.744623035421065, 0.7099267029789523, 0.7139782725657176, 0.6311719921886055, 0.1442810807060551] | [0.9886643605955049, 0.6797442922374429, 0.8574166404461264, 0.28879310344827586, 0.8775113250788533, 0.6809209400351095, 0.8584741452992447, 0.7561495673191986, 0.3399422965954991, 0.6367039597737272, 0.6593468629598508, 0.9177240953071054, 0.8675956167121772, 0.8506221709150505, 0.8458997899136098, 0.8436691261936412, 0.7386802514690003, 0.17215687627194648] |
| 0.1079 | 14.0 | 4200 | 0.1740 | 0.6086 | 0.7102 | 0.9487 | [0.977942105430483, 0.5768343105192344, 0.7490884365579236, 0.33361640430820216, 0.7508990053557766, 0.5692915941990903, 0.7201425857375091, 0.5911614079391055, 0.24133583561907512, 0.4868673699182174, 0.48424487890658796, 0.7959107866121614, 0.7422369582329509, 0.7456813503731655, 0.7169848463044369, 0.7136604053934179, 0.6272778540958716, 0.13243990384615384] | [0.9900950549598935, 0.7151902587519026, 0.8672900729656489, 0.3506400208986416, 0.8714781487884443, 0.6356343183229586, 0.8553712182931714, 0.799373193071111, 0.2893941142527409, 0.6196404776869893, 0.6261228800882435, 0.9018775840757001, 0.8677339330013742, 0.8500079854825, 0.8463084247687207, 0.8266161917788866, 0.7184256026710301, 0.15203338967265703] |
| 0.1021 | 15.0 | 4500 | 0.1712 | 0.6187 | 0.7237 | 0.9493 | [0.9782585994033568, 0.5740924715372847, 0.7491715773047957, 0.367804599342951, 0.748062714983797, 0.5987804324794581, 0.7265324207967331, 0.5926331677105839, 0.2712994816751525, 0.5070523117524783, 0.49529704138289005, 0.7990307004764193, 0.7460380903478778, 0.7483911832010839, 0.7192368133134107, 0.7175594657148434, 0.6334052080040388, 0.164652203479062] | [0.9893027356709104, 0.7036468797564688, 0.8529344826428314, 0.3923937652385928, 0.8803567268244968, 0.6857381065314445, 0.8468213027640663, 0.7854713742144883, 0.3412925562608194, 0.6801231929604022, 0.6553639393222006, 0.9000223775627637, 0.8557149167425175, 0.8472746995261232, 0.8353858554650505, 0.8262340981627397, 0.7411015349148411, 0.207126349556759] |
| 0.1021 | 16.0 | 4800 | 0.1706 | 0.6186 | 0.7235 | 0.9496 | [0.9785733461638714, 0.5773717122071097, 0.750095151508004, 0.3705804345608487, 0.7543896875914221, 0.5896111900384422, 0.7211858115423779, 0.598724412527825, 0.2816084038088942, 0.4991221198247389, 0.5039075245094231, 0.7986545855749791, 0.7465591063871141, 0.7489826949950806, 0.7194423946958957, 0.7197747616797256, 0.6252712948669017, 0.15062471757131873] | [0.9896713958789595, 0.7223561643835616, 0.8701044800435779, 0.3954197143852316, 0.8787325157580742, 0.6899935453269634, 0.8538148693899903, 0.7724648461386967, 0.35392960184650896, 0.6375235700817096, 0.6750001951138294, 0.8970855888495801, 0.8740418250611269, 0.8347348123090049, 0.8343137899039949, 0.8387099363605423, 0.7271435479917591, 0.17706184678003517] |
| 0.0975 | 17.0 | 5100 | 0.1723 | 0.6222 | 0.7301 | 0.9499 | [0.978806920523348, 0.5866061881769667, 0.7537012389986214, 0.3952780503434327, 0.7541491676661445, 0.5841659503641633, 0.7225422833286509, 0.6005818579770271, 0.28203029430812404, 0.5106977620730271, 0.4931532948501751, 0.7991115256696534, 0.7453387360353214, 0.7515511097787273, 0.7226869236310253, 0.7204798555392766, 0.6375654138472647, 0.1602542675687825] | [0.9891667516155837, 0.7448523592085236, 0.8655551644929528, 0.429706548241031, 0.883565785239796, 0.6592433851826135, 0.8723300706614971, 0.768198775229806, 0.36185804962492785, 0.6813048397234444, 0.6329752777770551, 0.903790332892886, 0.8699592636219236, 0.8491386933606876, 0.8456642239383105, 0.8352807264273926, 0.7565147209175482, 0.1918319478458832] |
| 0.0975 | 18.0 | 5400 | 0.1718 | 0.6243 | 0.7300 | 0.9503 | [0.978730780232829, 0.5890366204991634, 0.7527785649582212, 0.40151892678453965, 0.755079595556939, 0.5969949096729052, 0.73138008588973, 0.5988889144014531, 0.2887897988138566, 0.5094783824684405, 0.502135225079791, 0.7993543401282869, 0.748229647819225, 0.7549749431572005, 0.72446743106999, 0.7234529961894286, 0.6381000698161508, 0.14407203903521734] | [0.9894557917447941, 0.7244809741248097, 0.8572947615713421, 0.43850139324277254, 0.8871163010189252, 0.6821654205561777, 0.863791787037498, 0.7684944486285562, 0.378199653779573, 0.6686687617850409, 0.6708299621999474, 0.9071203273517753, 0.8656826131962664, 0.8442219942697464, 0.8422076538109607, 0.8299667353793002, 0.7568349088789226, 0.1647890724707668] |
| 0.0953 | 19.0 | 5700 | 0.1715 | 0.6239 | 0.7328 | 0.9501 | [0.9789040595256554, 0.5894054746593737, 0.753103238666386, 0.40122089891675145, 0.7573341787966735, 0.5819368258421082, 0.7239200863306356, 0.5998856882408785, 0.28982979543900855, 0.5125684778408359, 0.5018867850090838, 0.7986256654354295, 0.7481727644728351, 0.7535231345325545, 0.7220330250580966, 0.723473256480269, 0.6384977675166612, 0.15656856859236892] | [0.9890105377116778, 0.7282252663622527, 0.8720962047167125, 0.4378265412748171, 0.8839733828034565, 0.6621351520515592, 0.8736337745857762, 0.763339806788806, 0.3816387766878246, 0.6802991829038341, 0.6619041548839593, 0.9075273858744299, 0.8716681390658743, 0.8442187786392095, 0.8578800023716846, 0.8452263984947438, 0.7487032387564341, 0.18171156565830776] |
| 0.0924 | 20.0 | 6000 | 0.1695 | 0.6248 | 0.7318 | 0.9504 | [0.9788481644896356, 0.5892312801423537, 0.7530359111050211, 0.40150078831301017, 0.759007420581381, 0.5918265856291685, 0.7295908937991514, 0.5957128350073513, 0.29234848352263776, 0.50965626208837, 0.5006675514608004, 0.7987689193921012, 0.7493932173414964, 0.7541494190147875, 0.723483844844196, 0.7233847443350387, 0.6382073069749079, 0.15805024780469318] | [0.9894757584408714, 0.7237625570776256, 0.8611644158457461, 0.437957157784744, 0.8837220958158657, 0.6895901282621771, 0.8687869923662133, 0.7556548829789819, 0.38556260819388344, 0.6658026398491514, 0.6585143772877097, 0.9066994160521716, 0.8668649943781122, 0.8502555890338424, 0.8532504098367223, 0.8396507214826516, 0.7513961299212337, 0.18413990548791004] |
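
The reported Mean Iou appears to be the unweighted mean of the 18 per-category IoU values; a quick check against the final-epoch row (values copied from the table above):

```python
# Sanity check: average the final-epoch per-category IoU values from the table above.
per_category_iou = [
    0.9788481644896356, 0.5892312801423537, 0.7530359111050211, 0.40150078831301017,
    0.759007420581381, 0.5918265856291685, 0.7295908937991514, 0.5957128350073513,
    0.29234848352263776, 0.50965626208837, 0.5006675514608004, 0.7987689193921012,
    0.7493932173414964, 0.7541494190147875, 0.723483844844196, 0.7233847443350387,
    0.6382073069749079, 0.15805024780469318,
]
print(round(sum(per_category_iou) / len(per_category_iou), 4))  # 0.6248
```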

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2