
segformer-b0-finetuned

This model is a fine-tuned version of nvidia/mit-b1 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.9071
  • Mean Iou: 0.1088
  • Mean Accuracy: 0.1908
  • Overall Accuracy: 0.6298
  • Per Category Iou: [0.8815300289358812, nan, nan, 0.0, nan, 0.0, nan, nan, 0.00020367428408489145, 0.0, 1.922160200519752e-05, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.5023311664113287, nan, 0.0, nan, nan, nan, nan, 0.04263591289343206, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.11130254996554101, 0.002219568005846906, 0.14581909826596703, 0.0, nan, nan, 0.11577986428757406, nan, 0.0, nan, nan, nan, 0.4133480309607676, 0.7698399394502776, nan, nan, nan, nan, nan, 0.0, 0.051393904376715295, 0.0015747565051265975, nan, nan, 0.15364890672274317, nan, 0.0, 0.317072675103839, nan, nan, 0.0, nan, nan, nan, 0.008494562606454995, nan, 0.0011486708237610766, nan, 0.48252163080599647, 0.08222372349467169, nan, 0.31241471271816407, nan, 0.03380186646422355, nan, nan, nan, 0.0, 0.02517664559315731, nan, 0.008166590874639989, nan, nan, nan, nan, nan, nan, 0.0]
  • Per Category Accuracy: [0.9449922698971936, nan, nan, 0.0, nan, 0.0, nan, nan, 0.00020691767159682507, 0.0, 2.0820407330449014e-05, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.6645813703792975, nan, 0.0, nan, nan, nan, nan, 0.04263591289343206, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.19053036287986516, 0.002231234822055941, 0.5107647161760153, 0.0, nan, nan, 0.459965776165909, nan, 0.0, nan, nan, nan, 0.8659097975107563, 0.8084077003909939, nan, nan, nan, nan, nan, nan, 0.06540741762029505, 0.001804114965670916, nan, nan, 0.6114336948168388, nan, 0.0, 0.4204204366030964, nan, nan, 0.0, nan, nan, nan, 0.008494562606454995, nan, 0.001256055984209582, nan, 0.627599889776798, 0.1183779119930975, nan, 0.9551009134116257, nan, 0.2985842985842986, nan, nan, nan, 0.0, 0.025185074960008928, nan, 0.008194227917411308, nan, nan, nan, nan, nan, nan, 0.0]
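
The card does not include usage code, so the following is a minimal, hedged inference sketch for a SegFormer semantic-segmentation checkpoint like this one. The repo id MF21377197/segformer-b0-finetuned is the one this card is published under; the input image path is hypothetical, and if the repo lacks a preprocessor config the processor can be loaded from the base checkpoint nvidia/mit-b1 instead.

```python
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

repo_id = "MF21377197/segformer-b0-finetuned"
# If the repo has no preprocessor config, load the processor from "nvidia/mit-b1" instead.
processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_seg = upsampled.argmax(dim=1)[0]  # (H, W) map of class indices
```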

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
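
As a hedged reconstruction only, the hyperparameters above map onto transformers.TrainingArguments as sketched below. The output directory is a placeholder, and the per-epoch evaluation/logging strategy is inferred from the results table rather than stated in the card.

```python
from transformers import TrainingArguments

# Sketch of a TrainingArguments object matching the listed hyperparameters.
training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned",  # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,               # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # inferred from the per-epoch results below
    logging_strategy="epoch",
)
```

These arguments would then be passed to a transformers.Trainer together with the training and evaluation datasets, which are not documented in this card.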

Training results

Training Loss Epoch Step Validation Loss Mean Iou Mean Accuracy Overall Accuracy Per Category Iou Per Category Accuracy
3.3724 1.0 80 3.3682 0.0227 0.0633 0.4885 [0.8467888696494846, nan, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.09441726434636283, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 1.1622095928779797e-05, 0.0, 0.005814745356061479, nan, 0.01891412131970374, nan, nan, nan, nan, 0.010668900550041095, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, 0.005783748098395477, 8.252266946272889e-05, 0.07968389394162366, 0.0, 0.0, nan, 0.0393799636026304, nan, 0.0, nan, 0.0, 0.0, 0.007038921020019595, 0.00012794268167860799, nan, 0.0, nan, nan, nan, 0.0, 0.007320843170893279, 0.0023965662643019876, nan, 0.0, 0.0011544414279552433, nan, 0.0, 0.0001130195873708684, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 2.149127991317523e-05, 0.0406427281897089, nan, 0.1285392001976426, 0.0, 0.0058999248115207716, 0.0, 0.0, nan, 0.03073764607919143, 0.002789501331352908, nan, 0.013104387784148672, nan, nan, nan, nan, nan, 0.0, 0.0] [0.9092377477775717, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.23971784183985775, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 1.163494205798855e-05, 0.0, 0.006080592501698183, nan, 0.07183540625363245, nan, nan, nan, nan, 0.014743463731079222, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.006367514615263074, 8.382539662972596e-05, 0.2434438476759682, 0.0, nan, nan, 0.5861895762033679, nan, 0.0, nan, nan, nan, 0.0071420333169523315, 0.00012801980039579456, nan, nan, nan, nan, nan, nan, 0.007923158233374696, 0.0026920336164870094, nan, nan, 0.0012950787009364416, nan, 0.0, 0.0001131654532815287, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, 2.2044640396803526e-05, 0.09923915601223625, nan, 0.188311379070587, nan, 0.058998848472532686, nan, nan, nan, 0.06181072810208441, 0.0028644767679773817, nan, 0.024506635233278833, nan, nan, nan, nan, nan, nan, 0.0]
3.1916 2.0 160 2.7928 0.0434 0.1066 0.5595 [0.8525004870189609, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.004728497697561579, nan, nan, nan, 0.00035642152786028277, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0013362353978399824, 0.0, nan, nan, 0.0, 0.0, 0.07106918848700475, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.04046728022232925, 0.0, 0.13411651586754517, 0.0, 0.0, nan, 0.07047441066870301, nan, 0.0, nan, nan, 0.0, 0.32528133273677945, 0.11296406903272453, nan, nan, nan, nan, nan, 0.0, 0.0054949546325646454, 0.00014381440750711207, nan, nan, 0.1409758586428207, nan, 0.0, 0.04062160004847195, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, 3.4858995363753616e-05, nan, 0.0018346608074704751, 0.024640252316183716, nan, 0.1464206904454781, nan, 0.0061474905943393904, nan, nan, nan, 0.0016092066123762619, 0.009260508737602051, nan, 0.004144189203672609, nan, nan, nan, nan, nan, 0.0, 0.0] [0.9268010267366377, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.005096835714493918, nan, nan, nan, 0.0003770217792914504, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0013901030395964402, 0.0, nan, nan, 0.0, 0.0, 0.08031860113504394, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0482382682888292, 0.0, 0.6977156779603979, 0.0, nan, nan, 0.4022747782261497, nan, 0.0, nan, nan, nan, 0.5523272723625751, 0.11500978818057193, nan, nan, nan, nan, nan, nan, 0.005560917321568087, 0.00018097704984786618, nan, nan, 0.3735661628668204, nan, 0.0, 0.041547887847646965, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 3.5887313834559486e-05, nan, 0.0018407274731330945, 0.027453133579104245, nan, 0.9492527930965511, nan, 0.01693422746054325, nan, nan, nan, 0.0019054217910964837, 0.010211673672854433, nan, 0.0044108140993954145, nan, nan, nan, nan, nan, nan, 0.0]
2.6198 3.0 240 2.5865 0.0471 0.1105 0.5557 [0.8679184663370483, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.005656437136100239, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.05370877461442435, nan, 0.0, nan, nan, nan, nan, 0.00946345038694518, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0029512460004772643, 0.0, 0.12176925503681421, 0.0, 0.0, nan, 0.057959139723436866, nan, 0.0, nan, nan, 0.0, 0.3396281526697906, 0.031032710721439732, nan, nan, nan, nan, nan, nan, 0.08261479707605197, 0.0172495012493189, nan, nan, 0.1195494594578602, nan, 0.0, 0.1039967756081857, nan, nan, 0.0, nan, nan, nan, 0.0006501042784715132, nan, 0.0, nan, 0.0015167552349879424, 0.00464944989181973, nan, 0.14581059648547645, 0.0, 0.018741633199464525, nan, nan, nan, 0.006616680837906779, 0.032363019625488904, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0] [0.9214364074380388, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.006812437278522917, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.05845038017398164, nan, 0.0, nan, nan, nan, nan, 0.009588711967324116, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.003191657449834097, 0.0, 0.718044394190257, 0.0, nan, nan, 0.29327782602033, nan, 0.0, nan, nan, nan, 0.516016608330241, 0.03156754911426301, nan, nan, nan, nan, nan, nan, 0.10876418953076887, 0.022469431845174134, nan, nan, 0.5750576381180087, nan, 0.0, 0.11054109253161896, nan, nan, 0.0, nan, nan, nan, 0.000650740271651308, nan, 0.0, nan, 0.0015321025075778452, 0.004753313985410621, nan, 0.9658446914497433, nan, 0.024656235182550973, nan, nan, nan, 0.012356371614989318, 0.03478293218258249, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0]
2.1644 4.0 320 2.3506 0.0765 0.1407 0.5969 [0.8603524070492433, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0002092622700127167, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.19935030050700064, nan, 0.0, nan, nan, nan, nan, 0.0008950987883418841, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.03200840511364548, 0.0001446390397928965, 0.14049749391759841, 0.0, 0.0, nan, 0.05448735265778786, nan, 0.0, nan, nan, nan, 0.43600618002295666, 0.4323473331615563, nan, nan, nan, nan, nan, nan, 0.03161891341788007, 0.0027459676930720665, nan, nan, 0.08848145715164914, nan, 0.0, 0.24103335003618953, nan, nan, 0.0, nan, nan, nan, 0.001209263746098269, nan, 0.0, nan, 0.21952586389206108, 0.08604763767063336, nan, 0.2212086270842507, nan, 0.014455966568642625, nan, nan, nan, 9.186321567186459e-05, 0.07102638898310834, nan, 0.0009096077316657191, nan, nan, nan, nan, nan, nan, 0.0] [0.9470790374097539, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.00021653223623666973, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.2601945789600543, nan, 0.0, nan, nan, nan, nan, 0.0008955289081099972, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.04203929004055406, 0.00014546171768099506, 0.7112627710697411, 0.0, nan, nan, 0.20605387274182288, nan, 0.0, nan, nan, nan, 0.6881098998084426, 0.447519883075249, nan, nan, nan, nan, nan, nan, 0.03712486787076612, 0.0032038593355880056, nan, nan, 0.5769931403523753, nan, 0.0, 0.2916165954442822, nan, nan, 0.0, nan, nan, nan, 0.0012097654714591432, nan, 0.0, nan, 0.228834389639019, 0.14496823280257276, nan, 0.9282414683163319, nan, 0.03163313689629479, nan, nan, nan, 0.00011548010855130203, 0.0778988876901901, nan, 0.0009125822274611201, nan, nan, nan, nan, nan, nan, 0.0]
1.9229 5.0 400 2.1633 0.0870 0.1542 0.6050 [0.8773283817065187, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0001220414201183432, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.3697299055414764, nan, 0.0, nan, nan, nan, nan, 0.021168574133035023, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.07625496459886678, 0.0, 0.1397230259276946, 0.0, nan, nan, 0.07665436321672486, nan, 0.0, nan, nan, nan, 0.39410758176906385, 0.6336055710591669, nan, nan, nan, nan, nan, nan, 0.04560313828048382, 0.002229418819041984, nan, nan, 0.0980270699788945, nan, 0.0, 0.2589466636911404, nan, nan, 0.0, nan, nan, nan, 0.0009520274253772081, nan, 0.0003587572648346129, nan, 0.2360707902430961, 0.0390448957001858, nan, 0.1935092262868242, nan, 0.008375591134408295, nan, nan, nan, 0.0, 0.006497232660163264, nan, 0.0012918687161820599, nan, nan, nan, nan, nan, nan, 0.0] [0.9391028302756103, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0001374146883809635, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.5978044130858733, nan, 0.0, nan, nan, nan, nan, 0.02149269379463993, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.1260968030757887, 0.0, 0.6829382546541211, 0.0, nan, nan, 0.25574653930632885, nan, 0.0, nan, nan, nan, 0.738322267799296, 0.6600487542073173, nan, nan, nan, nan, nan, nan, 0.05128912174272715, 0.002386634844868735, nan, nan, 0.4977371701818803, nan, 0.0, 0.31264381443021194, nan, nan, 0.0, nan, nan, nan, 0.0009520897934227192, nan, 0.00035887313834559486, nan, 0.2518600165334803, 0.04648207702564907, nan, 0.9564819841700717, nan, 0.01991465149359886, nan, nan, nan, 0.0, 0.006528774971169228, nan, 0.0013118369519753603, nan, nan, nan, nan, nan, nan, 0.0]
2.7141 6.0 480 1.9697 0.0905 0.1621 0.6164 [0.875784639037412, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.4103656175673433, nan, 0.0, nan, nan, nan, nan, 0.017552285726760687, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.07870038822509517, 0.0, 0.15465560109157453, 0.0, nan, nan, 0.10557401799947705, nan, 0.0, nan, nan, nan, 0.397134527506845, 0.615739079623156, nan, nan, nan, nan, nan, nan, 0.05848507675922799, 1.8869043447320325e-05, nan, nan, 0.10896633214040169, nan, 0.0, 0.3025006756569833, nan, nan, 0.0, nan, nan, nan, 0.0005590251998078351, nan, 0.0, nan, 0.20771130598246157, 0.059782752332373075, nan, 0.21166027063959567, nan, 0.015185313703134553, nan, nan, nan, 0.0, 1.857527630723507e-05, nan, 0.0006083534533564001, nan, nan, nan, nan, nan, nan, 0.0] [0.9424563335460676, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.5689791178210662, nan, 0.0, nan, nan, nan, nan, 0.01756110346635214, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.10996997945963027, 0.0, 0.5594852535368628, 0.0, nan, nan, 0.4262399754814323, nan, 0.0, nan, nan, nan, 0.8420817077696096, 0.6407764400894005, nan, nan, nan, nan, nan, nan, 0.08277953950089618, 1.9794364827110363e-05, nan, nan, 0.5932172031992713, nan, 0.0, 0.3830084766313339, nan, nan, 0.0, nan, nan, nan, 0.0005590251998078351, nan, 0.0, nan, 0.22896665748139983, 0.07097027217821006, nan, 0.9545103866045661, nan, 0.062250220144956986, nan, nan, nan, 0.0, 1.8600498493359623e-05, nan, 0.0006083881516407468, nan, nan, nan, nan, nan, nan, 0.0]
2.8667 7.0 560 1.9407 0.1027 0.1798 0.6270 [0.8755148354236076, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0004734830258259969, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.46088846335050687, nan, 0.0, nan, nan, nan, nan, 0.011729244479391915, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.06853778575779923, 8.127398198180448e-05, 0.15649273924386428, 0.0, nan, nan, 0.11218457353120695, nan, 0.0, nan, nan, nan, 0.4505762766705812, 0.7143217802142316, nan, nan, nan, nan, nan, 0.0, 0.04256552004948751, 0.007647910396586537, nan, nan, 0.1562136593938714, nan, 0.0, 0.2898548692140018, nan, nan, 0.0, nan, nan, nan, 0.01209596422738666, nan, 0.0, nan, 0.43981007910529185, 0.10835580746903141, nan, 0.28202050654872796, nan, 0.015566793850044025, nan, nan, nan, 0.0, 0.0031614966897269953, nan, 0.0026399756894325004, nan, nan, nan, nan, nan, nan, 0.0] [0.9516613558945399, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0005121820203290458, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.6645046782216184, nan, 0.0, nan, nan, nan, nan, 0.011729244479391915, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.09006688787064834, 8.13599437876752e-05, 0.6428919490099476, 0.0, nan, nan, 0.492899831434847, nan, 0.0, nan, nan, nan, 0.7820984063625149, 0.7395383819364062, nan, nan, nan, nan, nan, nan, 0.04996553150420516, 0.009082785689239783, nan, nan, 0.5945549768024364, nan, 0.0, 0.3898199591526602, nan, nan, 0.0, nan, nan, nan, 0.01209765471459143, nan, 0.0, nan, 0.568696610636539, 0.14874892148403795, nan, 0.9444333323808708, nan, 0.09340919867235657, nan, nan, nan, 0.0, 0.0031620847438711356, nan, 0.002642686033689494, nan, nan, nan, nan, nan, nan, 0.0]
1.6915 8.0 640 1.9072 0.1093 0.1853 0.6315 [0.8772425118264209, nan, nan, 0.0, nan, 0.0, nan, nan, 0.003156203374268335, 0.0, 0.002018674689404686, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.45600371323153416, nan, 0.0, nan, nan, nan, nan, 0.049884294633890755, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.10574545118806593, 0.00030984343070450035, 0.15728494074766308, 0.0, nan, nan, 0.10957834434615522, nan, 0.0, nan, nan, nan, 0.40828881095838826, 0.7623622808579934, nan, nan, nan, nan, nan, nan, 0.0599020226027065, 0.010532844786988434, nan, nan, 0.1926762714547808, nan, 0.0, 0.3115450053986847, nan, nan, 0.0, nan, nan, nan, 0.006965978075730445, nan, 0.004690606361829026, nan, 0.4270682482148986, 0.0905241377651064, nan, 0.28274099594279106, nan, 0.028024685876659416, nan, nan, nan, 0.0, 0.016888310239003067, nan, 0.010560181680545042, nan, nan, nan, nan, nan, nan, 0.0] [0.9518911543885634, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0031865321425911058, 0.0, 0.0021569941994345177, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.6350548896728532, nan, 0.0, nan, nan, nan, nan, 0.049909355000764474, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.16889977352925686, 0.0003106470580983962, 0.4997832788165139, 0.0, nan, nan, 0.45636886822972533, nan, 0.0, nan, nan, nan, 0.8699001594640403, 0.7990995940705495, nan, nan, nan, nan, nan, nan, 0.07946137230571258, 0.012965308961757287, nan, nan, 0.5888765548060229, nan, 0.0, 0.4104888208698651, nan, nan, 0.0, nan, nan, nan, 0.006965978075730445, nan, 0.005418984389018482, nan, 0.5623367318820611, 0.11804847439014825, nan, 0.9505005190921126, nan, 0.21377768746189799, nan, nan, nan, 0.0, 0.016889252631970535, nan, 0.010608768394235522, nan, nan, nan, nan, nan, nan, 0.0]
1.8736 9.0 720 1.8703 0.1104 0.1908 0.6318 [0.8705214677311887, nan, nan, 0.0, nan, 0.0, nan, nan, 0.007043629790075404, 0.0, 0.0007028420713865594, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.48323459400263136, nan, 0.0, nan, nan, nan, nan, 0.03907614390499476, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.13195792053927075, 0.001797306985926006, 0.13382265047595446, 0.0, nan, nan, 0.11776142740081341, nan, 0.0, nan, nan, nan, 0.4367590967115781, 0.7731876942800889, nan, nan, nan, nan, nan, 0.0, 0.0467442090255918, 0.011127882138218248, nan, nan, 0.17179935113551284, nan, 0.0, 0.3133920993554191, nan, nan, 0.0, nan, nan, nan, 0.016547544719848758, nan, 0.0023474923072042634, nan, 0.4340608716523301, 0.08896054430287283, nan, 0.33867263472971293, nan, 0.032804060346433225, nan, nan, nan, 0.0, 0.044601506556309865, nan, 0.02781251184115797, nan, nan, nan, nan, nan, nan, 0.0] [0.9601470255315228, nan, nan, 0.0, nan, 0.0, nan, nan, 0.007159351437250147, 0.0, 0.0007911754785570624, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.6961566272980257, nan, 0.0, nan, nan, nan, nan, 0.03909748159797305, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.23135829778269343, 0.001804711480381159, 0.4942912140424558, 0.0, nan, nan, 0.4661422417462669, nan, 0.0, nan, nan, nan, 0.7819291638668525, 0.8106693835313196, nan, nan, nan, nan, nan, nan, 0.060875959373132955, 0.013154769310816772, nan, nan, 0.5878234138843822, nan, 0.0, 0.41488071822341016, nan, nan, 0.0, nan, nan, nan, 0.01655238677556012, nan, 0.0026556612237574016, nan, 0.6179333149627997, 0.13373597929249353, nan, 0.9354230362602509, nan, 0.28632391790286527, nan, nan, nan, 0.0, 0.04460399538707637, nan, 0.02790980645651926, nan, nan, nan, nan, nan, nan, 0.0]
2.1309 10.0 800 1.9071 0.1088 0.1908 0.6298 [0.8815300289358812, nan, nan, 0.0, nan, 0.0, nan, nan, 0.00020367428408489145, 0.0, 1.922160200519752e-05, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.5023311664113287, nan, 0.0, nan, nan, nan, nan, 0.04263591289343206, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.11130254996554101, 0.002219568005846906, 0.14581909826596703, 0.0, nan, nan, 0.11577986428757406, nan, 0.0, nan, nan, nan, 0.4133480309607676, 0.7698399394502776, nan, nan, nan, nan, nan, 0.0, 0.051393904376715295, 0.0015747565051265975, nan, nan, 0.15364890672274317, nan, 0.0, 0.317072675103839, nan, nan, 0.0, nan, nan, nan, 0.008494562606454995, nan, 0.0011486708237610766, nan, 0.48252163080599647, 0.08222372349467169, nan, 0.31241471271816407, nan, 0.03380186646422355, nan, nan, nan, 0.0, 0.02517664559315731, nan, 0.008166590874639989, nan, nan, nan, nan, nan, nan, 0.0] [0.9449922698971936, nan, nan, 0.0, nan, 0.0, nan, nan, 0.00020691767159682507, 0.0, 2.0820407330449014e-05, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.6645813703792975, nan, 0.0, nan, nan, nan, nan, 0.04263591289343206, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.19053036287986516, 0.002231234822055941, 0.5107647161760153, 0.0, nan, nan, 0.459965776165909, nan, 0.0, nan, nan, nan, 0.8659097975107563, 0.8084077003909939, nan, nan, nan, nan, nan, nan, 0.06540741762029505, 0.001804114965670916, nan, nan, 0.6114336948168388, nan, 0.0, 0.4204204366030964, nan, nan, 0.0, nan, nan, nan, 0.008494562606454995, nan, 0.001256055984209582, nan, 0.627599889776798, 0.1183779119930975, nan, 0.9551009134116257, nan, 0.2985842985842986, nan, nan, nan, 0.0, 0.025185074960008928, nan, 0.008194227917411308, nan, nan, nan, nan, nan, nan, 0.0]
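
The metric names in the table (Mean Iou, Mean Accuracy, Overall Accuracy, Per Category Iou/Accuracy) match the output keys of the mean_iou metric from the evaluate library, so the results were most likely computed with it. The snippet below is an illustrative sketch only: num_labels is a placeholder (the true value equals the length of the per-category lists above), ignore_index follows a common convention, and the prediction/reference arrays are dummy data.

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

num_labels = 150  # placeholder; the real count equals the per-category list length above
predictions = [np.zeros((4, 4), dtype=np.int64)]  # dummy predicted label map
references = [np.zeros((4, 4), dtype=np.int64)]   # dummy ground-truth label map

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=num_labels,
    ignore_index=255,  # common convention for unlabeled pixels (assumption)
    reduce_labels=False,
)

# The returned dict uses the same keys reported in this card:
# mean_iou, mean_accuracy, overall_accuracy, per_category_iou, per_category_accuracy
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
```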

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2
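
For reproducibility, a small optional check (not part of the original card) can confirm that a local environment matches the versions listed above:

```python
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.39.3",
    "torch": "2.2.1+cu121",
    "datasets": "2.18.0",
    "tokenizers": "0.15.2",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, version in expected.items():
    status = "OK" if installed[name] == version else "differs"
    print(f"{name}: expected {version}, installed {installed[name]} ({status})")
```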