---
license: cc
base_model: nvidia/segformer-b0-finetuned-cityscapes-1024-1024
tags:
  - generated_from_trainer
  - medical
  - computer-vision
  - image-segmentation
  - breast-cancer
model-index:
  - name: segformer-v1-breastcancer
    results: []
datasets:
  - as-cle-bert/breastcancer-semantic-segmentation
pipeline_tag: image-segmentation
---

# segformer-v1-breastcancer

This model is a fine-tuned version of [nvidia/segformer-b0-finetuned-cityscapes-1024-1024](https://huggingface.co/nvidia/segformer-b0-finetuned-cityscapes-1024-1024) on the [as-cle-bert/breastcancer-semantic-segmentation](https://huggingface.co/datasets/as-cle-bert/breastcancer-semantic-segmentation) dataset. It achieves the following results on the evaluation set:

- Loss: 0.2084
- Mean Iou: 0.6074
- Mean Accuracy: 0.7133
- Overall Accuracy: 0.6718
- Per Category Iou: [0.6503515075769412, 0.5644565972298056]
- Per Category Accuracy: [0.7843872475128127, 0.6421245639664888]
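
These metric names match the output of the `mean_iou` metric from the Hugging Face `evaluate` library. The snippet below is a minimal sketch of how such numbers can be reproduced from predicted and reference segmentation maps; the two-class setup and the `ignore_index` value are assumptions, not stated in this card.

```python
# Minimal sketch (not this card's original evaluation code): reproducing the
# reported metric names with the `evaluate` library's mean_iou metric.
# Assumptions: 2 semantic classes (the per-category lists have two entries)
# and ignore_index=255; both are guesses, not taken from the card.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy predicted / ground-truth label maps; real use would pass model outputs.
pred = np.zeros((2, 64, 64), dtype=np.int64)
ref = np.zeros((2, 64, 64), dtype=np.int64)
ref[:, 32:, :] = 1

results = metric.compute(
    predictions=pred,
    references=ref,
    num_labels=2,        # assumed number of classes
    ignore_index=255,    # assumed "void" label
    reduce_labels=False,
)
# The result dict contains mean_iou, mean_accuracy, overall_accuracy,
# per_category_iou and per_category_accuracy, as reported above.
print({k: results[k] for k in ("mean_iou", "mean_accuracy", "overall_accuracy")})
```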

## Model description

segformer-v1-breastcancer is a SegFormer (MiT-b0) semantic-segmentation model fine-tuned from nvidia/segformer-b0-finetuned-cityscapes-1024-1024 for breast-cancer image segmentation, trained on the as-cle-bert/breastcancer-semantic-segmentation dataset.
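
A minimal inference sketch follows, assuming the checkpoint is published as `as-cle-bert/segformer-v1-breastcancer` (the repository id is inferred from the card metadata and may differ):

```python
# Hedged usage sketch; the repo id and the input image path are assumptions.
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo_id = "as-cle-bert/segformer-v1-breastcancer"  # assumed repository id
processor = AutoImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)

image = Image.open("example.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]
print(segmentation.shape, segmentation.unique())
```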

## Intended uses & limitations

More information needed

## Training and evaluation data

The model was trained and evaluated on the [as-cle-bert/breastcancer-semantic-segmentation](https://huggingface.co/datasets/as-cle-bert/breastcancer-semantic-segmentation) dataset listed in the card metadata.
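
Split and column names are not documented here, so the short sketch below only loads the dataset and prints what is available:

```python
# Hedged sketch: load the dataset referenced in the card metadata and
# inspect its splits and columns, since they are not documented in the card.
from datasets import load_dataset

ds = load_dataset("as-cle-bert/breastcancer-semantic-segmentation")
print(ds)                        # available splits and their sizes
first_split = next(iter(ds))
print(ds[first_split].features)  # column names and feature types
```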

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map onto the `Trainer` API follows the list):

- learning_rate: 6e-05
- train_batch_size: 3
- eval_batch_size: 3
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
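
The following is a hedged sketch, not the original training script: the output directory, the dataset objects, the 2-label head, and the evaluation cadence are assumptions.

```python
# Sketch only: wiring the hyperparameters above into TrainingArguments.
# output_dir, num_labels=2 and the evaluation cadence are assumptions.
from transformers import (
    SegformerForSemanticSegmentation,
    Trainer,
    TrainingArguments,
)

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/segformer-b0-finetuned-cityscapes-1024-1024",
    num_labels=2,                  # assumed: two semantic classes
    ignore_mismatched_sizes=True,  # replace the 19-class Cityscapes head
)

args = TrainingArguments(
    output_dir="segformer-v1-breastcancer",
    learning_rate=6e-5,
    per_device_train_batch_size=3,
    per_device_eval_batch_size=3,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    # Adam betas=(0.9, 0.999) and epsilon=1e-8 are the Trainer defaults.
    evaluation_strategy="steps",   # assumed: the results table logs every 20 steps
    eval_steps=20,
)

# train_ds / eval_ds would be the preprocessed segmentation datasets (not shown):
# trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```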

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
| 1.0349 | 1.82 | 20 | 0.9385 | 0.1001 | 0.3453 | 0.5410 | [0.00702490904002239, 0.19315512632820392] | [0.00945884835694905, 0.6811663337407948] |
| 0.8631 | 3.64 | 40 | 0.8712 | 0.1270 | 0.3748 | 0.5931 | [9.482867619168036e-05, 0.25396146547703435] | [0.00011305396442568585, 0.7494807350208202] |
| 0.6657 | 5.45 | 60 | 0.6510 | 0.1313 | 0.2115 | 0.3347 | [0.00014806040864672785, 0.26239433754030744] | [0.00015073861923424781, 0.42294505232402135] |
| 0.6924 | 7.27 | 80 | 0.5721 | 0.1917 | 0.3061 | 0.4833 | [0.002107933665379521, 0.38125252274350296] | [0.0021291829966837506, 0.6101487731433172] |
| 0.5177 | 9.09 | 100 | 0.4836 | 0.1991 | 0.3081 | 0.4876 | [0.0, 0.3981538560328774] | [0.0, 0.6162060363932699] |
| 0.3851 | 10.91 | 120 | 0.4029 | 0.2127 | 0.2893 | 0.4440 | [0.023690796530116853, 0.40166184061437743] | [0.02372249020198975, 0.5548632022499826] |
| 0.3266 | 12.73 | 140 | 0.3811 | 0.2300 | 0.3350 | 0.5130 | [0.028268806709322924, 0.43178856750464767] | [0.02946940006029545, 0.6405295012074774] |
| 0.3397 | 14.55 | 160 | 0.3353 | 0.2616 | 0.3719 | 0.5640 | [0.04190433583118965, 0.4812188969936601] | [0.042357552004823634, 0.7015294713932202] |
| 0.3008 | 16.36 | 180 | 0.3363 | 0.3885 | 0.4376 | 0.4135 | [0.4227194892852987, 0.35420100310527275] | [0.47910385890865237, 0.3961867565069616] |
| 0.2558 | 18.18 | 200 | 0.3163 | 0.4200 | 0.4832 | 0.4322 | [0.48242302607476784, 0.35761699452079404] | [0.570677570093458, 0.3956699760492134] |
| 0.2686 | 20.0 | 220 | 0.2771 | 0.4777 | 0.5444 | 0.5868 | [0.4603203796001692, 0.49515000498355427] | [0.4716046126017486, 0.6171352474086441] |
| 0.1953 | 21.82 | 240 | 0.2811 | 0.4756 | 0.5676 | 0.5920 | [0.46844517569632155, 0.4827354154204578] | [0.5257386192342478, 0.6095276427854467] |
| 0.1623 | 23.64 | 260 | 0.2612 | 0.4833 | 0.5416 | 0.5447 | [0.506478482184174, 0.46020570281796136] | [0.5361961109436237, 0.5469524860121443] |
| 0.1851 | 25.45 | 280 | 0.2620 | 0.5107 | 0.5880 | 0.5313 | [0.5881106780729983, 0.4333538137452822] | [0.6852389207114863, 0.49066316846049113] |
| 0.1315 | 27.27 | 300 | 0.2230 | 0.6652 | 0.7361 | 0.6967 | [0.7577948727059535, 0.5726185409040606] | [0.8037006331022007, 0.6684903053973743] |
| 0.1294 | 29.09 | 320 | 0.2330 | 0.5189 | 0.6179 | 0.6328 | [0.506419446816051, 0.5313992809888866] | [0.5923462466083811, 0.6434165151108594] |
| 0.1532 | 30.91 | 340 | 0.2326 | 0.5319 | 0.6251 | 0.6503 | [0.5461152173144251, 0.5176845532961513] | [0.581945281881218, 0.6683163888971706] |
| 0.1074 | 32.73 | 360 | 0.2280 | 0.5790 | 0.6418 | 0.5960 | [0.6624514966740577, 0.4955288623414331] | [0.7205682845945132, 0.5631018753167765] |
| 0.1184 | 34.55 | 380 | 0.2168 | 0.6385 | 0.7453 | 0.7145 | [0.7140882114917724, 0.5629577265658137] | [0.7980479348809165, 0.6925007205112151] |
| 0.1411 | 36.36 | 400 | 0.2191 | 0.5935 | 0.6776 | 0.6459 | [0.6633485862587079, 0.5236754959973609] | [0.7320432619837203, 0.6231328821442413] |
| 0.1224 | 38.18 | 420 | 0.2068 | 0.6114 | 0.6869 | 0.6689 | [0.6632029659025639, 0.5596692813228747] | [0.717949201085318, 0.6559037198254872] |
| 0.0892 | 40.0 | 440 | 0.2096 | 0.5867 | 0.6817 | 0.6756 | [0.6250170137471076, 0.548339821945447] | [0.692191739523666, 0.6711785575862378] |
| 0.103 | 41.82 | 460 | 0.2117 | 0.5693 | 0.6553 | 0.6511 | [0.6029494984137872, 0.5356447598629901] | [0.6625150738619234, 0.6480725082734564] |
| 0.0996 | 43.64 | 480 | 0.2082 | 0.6011 | 0.7024 | 0.6743 | [0.6408627400521119, 0.5614076241331366] | [0.7507725354235755, 0.6540800810947796] |
| 0.1095 | 45.45 | 500 | 0.2065 | 0.6254 | 0.7302 | 0.6836 | [0.6779631615467104, 0.5728211009174312] | [0.8100504974374435, 0.6502936704332012] |
| 0.097 | 47.27 | 520 | 0.2083 | 0.6079 | 0.7042 | 0.6628 | [0.6564823383005202, 0.5592888498683055] | [0.7753052457039493, 0.6330858749987578] |
| 0.0866 | 49.09 | 540 | 0.2084 | 0.6074 | 0.7133 | 0.6718 | [0.6503515075769412, 0.5644565972298056] | [0.7843872475128127, 0.6421245639664888] |

### Framework versions

- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2