
my_model_3

This model is a fine-tuned version of apple/deeplabv3-mobilevit-xx-small on the FrsECM/CelebAHQ_mask dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2560
  • Mean Iou: 0.5809
  • Mean Accuracy: 0.6806
  • Overall Accuracy: 0.9139
  • Per Category Iou: [0.8919686118230816, 0.6685126480236523, 0.8747044562496686, 0.8833085720795604, 0.711340654853021, 0.0017797375551187376, 0.5999932597419707, 0.43503524672708965, 0.4621466655632662, 0.5295999530392416, 0.5872745246930384, 0.47678709050791274, 0.7930179988260829, 0.5446353384631151, 0.6272271444587322, 0.6052765573405614, 0.5696758390032162, 0.2785029706405308, 0.4957813263783734, …]
  • Per Category Accuracy: [0.9340795455586407, 0.7993107362784472, 0.9405089464670838, 0.9459331187430433, 0.8324080810556224, 0.0017886222269681519, 0.7019941140835427, 0.5005054410951127, 0.5404423454984336, 0.5945500675475304, 0.6696180612278237, 0.6095998812163179, 0.8718696974845856, 0.6992669162717129, 0.7405660623179267, 0.7106133092784797, 0.6685587984126187, 0.36160280101147635, 0.8080611214773792, …]

In the per-category arrays here and in the training-results table below, the trailing run of nan entries (class slots not present in the evaluation data) is elided as "…"; every elided entry is nan.

Model description

More information needed

Intended uses & limitations

More information needed
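
Although usage is not documented here, the checkpoint should load the same way as its base model. Below is a minimal, non-authoritative inference sketch, assuming the checkpoint keeps the MobileViT semantic-segmentation architecture of apple/deeplabv3-mobilevit-xx-small; the repo id "my_model_3" and the input file name are placeholders.

```python
# Minimal inference sketch (assumptions: standard MobileViT segmentation
# checkpoint layout; "my_model_3" and "face.jpg" are placeholders).
import torch
from PIL import Image
from transformers import AutoImageProcessor, MobileViTForSemanticSegmentation

processor = AutoImageProcessor.from_pretrained("my_model_3")
model = MobileViTForSemanticSegmentation.from_pretrained("my_model_3")
model.eval()

image = Image.open("face.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, h, w), reduced resolution

# Upsample logits to the input resolution, then take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of predicted class indices
```

The upsample-then-argmax step mirrors the usual transformers segmentation recipe; it is not something this card specifies.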

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a configuration sketch follows the list:

  • learning_rate: 6e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1
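
These settings map one-to-one onto transformers' TrainingArguments. As a sketch only, assuming the model was trained with the Trainer API (the card does not say, and the output directory is a placeholder):

```python
# Reconstruction of the listed hyperparameters as TrainingArguments.
# Assumption: the Trainer API was used; "my_model_3" is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="my_model_3",
    learning_rate=6e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    adam_beta1=0.9,            # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,         # and epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=1,
)
```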

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
| 0.5588 | 0.14 | 1000 | 0.4594 | 0.3034 | 0.3775 | 0.8700 | [0.8441280428123001, 0.5693282558240229, 0.7975983961437951, 0.8498378142283486, 0.5761384911910784, 0.0, 0.12270583146229427, 0.06161763107077589, 0.002664418305407915, 0.0014959742265011834, 0.0, 0.009876162443763535, 0.6699313576513388, 0.1611794683801089, 0.29689850050845384, 0.40252735203646844, 0.2101523887973641, 0.002859159901164843, 0.4893042689942933, 0.0, …] | [0.9256661621269769, 0.681558875502008, 0.9326575335612457, 0.9256377345009891, 0.7357490216500608, 0.0, 0.12812084410207514, 0.06894853769663686, 0.002673759826136486, 0.001537667358616978, 0.0, 0.009941384302331466, 0.7811656980059632, 0.20498599904467435, 0.3748039210577058, 0.5131612623786742, 0.22835503272033136, 0.0029023638653139366, 0.6555286234432215, …] |
| 0.4188 | 0.28 | 2000 | 0.3415 | 0.4454 | 0.5268 | 0.8937 | [0.8764819541275325, 0.6302661063115487, 0.8387923788103758, 0.8669951068151325, 0.6498564698554723, 0.0, 0.4527335101762678, 0.24169950114561772, 0.21804654807774423, 0.08953576565973584, 0.27393886497928394, 0.2445465361251712, 0.7341006916653414, 0.3676319928106044, 0.5113383150644849, 0.5189963435433752, 0.4480486002340621, 0.03770827106993425, 0.4616427383079672, …] | [0.9263182199537994, 0.7792085676037483, 0.9323345405550992, 0.9417726033591793, 0.7782645488868637, 0.0, 0.5548963870796606, 0.28446630344424073, 0.24003008325430028, 0.09400090063373862, 0.29738839698979536, 0.2753689211944509, 0.8192402317337436, 0.4680323437799433, 0.6314341252924899, 0.6361238037453264, 0.521879010124351, 0.041334370744991245, 0.7874424845133146, …] |
| 0.2878 | 0.43 | 3000 | 0.2944 | 0.5206 | 0.6078 | 0.9047 | [0.8854988296644709, 0.6519540194302628, 0.8572464962098427, 0.8754505136415879, 0.6765368987181419, 0.0, 0.5327179018789144, 0.3243876143913752, 0.33155747754174003, 0.3568416085045081, 0.486924442178474, 0.37113329727675376, 0.7655284850568753, 0.4561619920055543, 0.5837876209545264, 0.5597729067188029, 0.5165728665497245, 0.17104933633859254, 0.4883077052383161, …] | [0.9273581392992951, 0.7939152878179384, 0.9405569705528336, 0.9451731224764754, 0.8077724254654342, 0.0, 0.6439414849572178, 0.3747563310479865, 0.3701401581029854, 0.39448856085318573, 0.5512627830234914, 0.428390649834479, 0.8369053909052186, 0.5806864043430134, 0.708576749163136, 0.6661744728604335, 0.6065226028304342, 0.20277644120025798, 0.7689479953598335, …] |
| 0.2627 | 0.57 | 4000 | 0.2755 | 0.5614 | 0.6658 | 0.9096 | [0.8846708436689866, 0.6470210644465468, 0.8667446237928943, 0.8813140447326048, 0.6962080344743431, 0.0, 0.5602997262978417, 0.38799053760653296, 0.4312441727010948, 0.480262505138127, 0.5559833588627013, 0.4323176311512354, 0.7823147670212306, 0.5063772292885359, 0.6012913438917675, 0.593308147367188, 0.5506520810251795, 0.25707177962502276, 0.5505954536437292, …] | [0.9357588059598548, 0.7608224899598394, 0.9378491950141642, 0.9393328129837988, 0.8290981566432067, 0.0, 0.6714353821723404, 0.45083897428767566, 0.5205066595145205, 0.5582226762002043, 0.6542239792640028, 0.534071006995414, 0.8625244686719534, 0.6603388259791373, 0.7126103731189016, 0.7251929847169348, 0.6650287473113614, 0.32531045567624567, 0.9075535059468354, …] |
| 0.2641 | 0.71 | 5000 | 0.2623 | 0.5696 | 0.6668 | 0.9120 | [0.8892892227722897, 0.6648377331251554, 0.8720944303146551, 0.8803700123155216, 0.7057805874888583, 0.0, 0.5988695721747876, 0.4282063786694269, 0.4409408981430017, 0.4854531697402193, 0.5761256783558699, 0.4422358774694712, 0.7884019813396796, 0.5192982132734839, 0.6211020547667461, 0.596838916967618, 0.5553414881389638, 0.2587870728463894, 0.49913310800201643, …] | [0.9329897829365719, 0.7914778045515395, 0.9402721199368602, 0.9474222357759178, 0.8159128330410409, 0.0, 0.7512537145438056, 0.49848078020408815, 0.5132872314834528, 0.5390787175854229, 0.6596040121540824, 0.5534779659644257, 0.8649550428158117, 0.644598834731522, 0.7449974898421119, 0.7078248076417115, 0.6542216762491038, 0.325740435508144, 0.7837236038626532, …] |
| 0.2675 | 0.85 | 6000 | 0.2554 | 0.5795 | 0.6773 | 0.9134 | [0.8910400464786082, 0.6668531904215245, 0.8742927195270097, 0.8806797580973037, 0.7063579299285884, 0.0012783296492583024, 0.5852087012059962, 0.44287665224585815, 0.4730663238884368, 0.5327117185133179, 0.5859973071744566, 0.4591704694170641, 0.791660787618116, 0.5388588717907311, 0.6314070927422667, 0.5969867362256072, 0.5678845348501144, 0.26967988187340874, 0.5145654951798955, …] | [0.9260822591882713, 0.7923114323962517, 0.9380834914484188, 0.95558258877134, 0.8226199866745292, 0.0012814009984249445, 0.6700266335927391, 0.5147504800966276, 0.5657808512344364, 0.6083616154293936, 0.6635954738022892, 0.6322497696878892, 0.8717189867584043, 0.6871880532808572, 0.7611806254027979, 0.6947199238262968, 0.6595167529009897, 0.3356248528342837, 0.7681071598184418, …] |
| 0.2389 | 0.99 | 7000 | 0.2560 | 0.5809 | 0.6806 | 0.9139 | [0.8919686118230816, 0.6685126480236523, 0.8747044562496686, 0.8833085720795604, 0.711340654853021, 0.0017797375551187376, 0.5999932597419707, 0.43503524672708965, 0.4621466655632662, 0.5295999530392416, 0.5872745246930384, 0.47678709050791274, 0.7930179988260829, 0.5446353384631151, 0.6272271444587322, 0.6052765573405614, 0.5696758390032162, 0.2785029706405308, 0.4957813263783734, …] | [0.9340795455586407, 0.7993107362784472, 0.9405089464670838, 0.9459331187430433, 0.8324080810556224, 0.0017886222269681519, 0.7019941140835427, 0.5005054410951127, 0.5404423454984336, 0.5945500675475304, 0.6696180612278237, 0.6095998812163179, 0.8718696974845856, 0.6992669162717129, 0.7405660623179267, 0.7106133092784797, 0.6685587984126187, 0.36160280101147635, 0.8080611214773792, …] |
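
The metric names above (Mean Iou, Mean Accuracy, Overall Accuracy, per-category arrays with nan for absent classes) match the output of the `evaluate` library's mean_iou metric, which the standard transformers segmentation examples use; that provenance is an assumption, not something this card states. A sketch with dummy data:

```python
# Sketch of how metrics with these names are typically computed.
# Assumptions: evaluate's mean_iou metric; NUM_LABELS is hypothetical (the
# per-category arrays above suggest a larger configured label space, where
# classes absent from the data come out as nan); inputs are random dummy masks.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

NUM_LABELS = 19

preds = [np.random.randint(0, NUM_LABELS, size=(64, 64))]   # predicted class map
labels = [np.random.randint(0, NUM_LABELS, size=(64, 64))]  # ground-truth mask

results = metric.compute(
    predictions=preds,
    references=labels,
    num_labels=NUM_LABELS,
    ignore_index=255,
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
```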

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.2.2
  • Datasets 2.18.0
  • Tokenizers 0.15.2