
segformer-b0-miic-tl

This model is a fine-tuned version of nvidia/mit-b0 on the yijisuk/ic-chip-sample dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the metric list):

  • Loss: 0.3415
  • Mean Iou: 0.4569
  • Mean Accuracy: 0.9138
  • Overall Accuracy: 0.9138
  • Accuracy Unlabeled: nan
  • Accuracy Circuit: 0.9138
  • Iou Unlabeled: 0.0
  • Iou Circuit: 0.9138
  • Dice Coefficient: 0.8323

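The checkpoint can be used for semantic segmentation directly through the transformers library. The snippet below is a minimal inference sketch, assuming the model is available on the Hub as yijisuk/segformer-b0-miic-tl; the image path is a placeholder, and if this repository does not ship an image-processor config, the processor can be loaded from the base checkpoint nvidia/mit-b0 instead.

```python
# Minimal inference sketch; the image path is a placeholder.
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

repo = "yijisuk/segformer-b0-miic-tl"
processor = SegformerImageProcessor.from_pretrained(repo)
model = SegformerForSemanticSegmentation.from_pretrained(repo)
model.eval()

image = Image.open("chip_sample.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```
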
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50

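The training script itself is not included in the card. The sketch below mirrors the listed values using the transformers TrainingArguments API; the output directory, label mapping, and the 250-step evaluation cadence are assumptions (the latter inferred from the results table), and Trainer uses AdamW by default whereas the card lists Adam, so treat this as an approximation rather than the original setup.

```python
# Configuration sketch only; not the original training script.
from transformers import SegformerForSemanticSegmentation, TrainingArguments

id2label = {0: "unlabeled", 1: "circuit"}  # assumed from the reported metrics
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=len(id2label),
    id2label=id2label,
    label2id={v: k for k, v in id2label.items()},
)

args = TrainingArguments(
    output_dir="segformer-b0-miic-tl",   # assumed output directory
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",
    eval_steps=250,                      # inferred from the results table below
)
```

These arguments would then be passed to a Trainer together with preprocessed train and eval splits of yijisuk/ic-chip-sample.
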
Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Circuit | Iou Unlabeled | Iou Circuit | Dice Coefficient |
|---------------|-------|------|-----------------|----------|---------------|------------------|--------------------|------------------|---------------|-------------|------------------|
| 0.3496        | 3.12  | 250  | 0.3203          | 0.4832   | 0.9665        | 0.9665           | nan                | 0.9665           | 0.0           | 0.9665      | 0.8163           |
| 0.2808        | 6.25  | 500  | 0.3289          | 0.4814   | 0.9629        | 0.9629           | nan                | 0.9629           | 0.0           | 0.9629      | 0.8271           |
| 0.2582        | 9.38  | 750  | 0.3404          | 0.4670   | 0.9339        | 0.9339           | nan                | 0.9339           | 0.0           | 0.9339      | 0.8327           |
| 0.2791        | 12.5  | 1000 | 0.3033          | 0.4591   | 0.9181        | 0.9181           | nan                | 0.9181           | 0.0           | 0.9181      | 0.8300           |
| 0.2668        | 15.62 | 1250 | 0.3117          | 0.4559   | 0.9118        | 0.9118           | nan                | 0.9118           | 0.0           | 0.9118      | 0.8285           |
| 0.2531        | 18.75 | 1500 | 0.2652          | 0.4686   | 0.9373        | 0.9373           | nan                | 0.9373           | 0.0           | 0.9373      | 0.8432           |
| 0.2326        | 21.88 | 1750 | 0.3256          | 0.4604   | 0.9208        | 0.9208           | nan                | 0.9208           | 0.0           | 0.9208      | 0.8315           |
| 0.2361        | 25.0  | 2000 | 0.3129          | 0.4656   | 0.9313        | 0.9313           | nan                | 0.9313           | 0.0           | 0.9313      | 0.8400           |
| 0.2167        | 28.12 | 2250 | 0.3135          | 0.4558   | 0.9116        | 0.9116           | nan                | 0.9116           | 0.0           | 0.9116      | 0.8290           |
| 0.2133        | 31.25 | 2500 | 0.3132          | 0.4560   | 0.9120        | 0.9120           | nan                | 0.9120           | 0.0           | 0.9120      | 0.8219           |
| 0.1769        | 34.38 | 2750 | 0.3200          | 0.4441   | 0.8882        | 0.8882           | nan                | 0.8882           | 0.0           | 0.8882      | 0.8176           |
| 0.1899        | 37.5  | 3000 | 0.3342          | 0.4612   | 0.9224        | 0.9224           | nan                | 0.9224           | 0.0           | 0.9224      | 0.8363           |
| 0.1765        | 40.62 | 3250 | 0.3445          | 0.4625   | 0.9249        | 0.9249           | nan                | 0.9249           | 0.0           | 0.9249      | 0.8369           |
| 0.1739        | 43.75 | 3500 | 0.3235          | 0.4608   | 0.9216        | 0.9216           | nan                | 0.9216           | 0.0           | 0.9216      | 0.8373           |
| 0.1639        | 46.88 | 3750 | 0.3527          | 0.4591   | 0.9181        | 0.9181           | nan                | 0.9181           | 0.0           | 0.9181      | 0.8342           |
| 0.1734        | 50.0  | 4000 | 0.3415          | 0.4569   | 0.9138        | 0.9138           | nan                | 0.9138           | 0.0           | 0.9138      | 0.8323           |

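For reference, the sketch below illustrates how metrics of this kind are commonly computed for a two-class (unlabeled vs. circuit) setup; it is a generic illustration, not the evaluation code behind this card. It is consistent with the pattern in the table above: the mean accuracy equals the circuit accuracy because the unlabeled accuracy is nan and is skipped, while the mean IoU is half the circuit IoU because the unlabeled IoU of 0.0 is averaged in.

```python
# Generic metric sketch for labels {0: unlabeled, 1: circuit};
# not the evaluation code actually used for this card.
import numpy as np

def binary_segmentation_metrics(pred: np.ndarray, target: np.ndarray) -> dict:
    metrics = {}
    for cls, name in [(0, "unlabeled"), (1, "circuit")]:
        pred_c, target_c = pred == cls, target == cls
        inter = np.logical_and(pred_c, target_c).sum()
        union = np.logical_or(pred_c, target_c).sum()
        metrics[f"iou_{name}"] = inter / union if union else float("nan")
        metrics[f"accuracy_{name}"] = inter / target_c.sum() if target_c.sum() else float("nan")
    # Dice on the circuit class: 2*|P∩T| / (|P| + |T|)
    p, t = pred == 1, target == 1
    metrics["dice_circuit"] = 2 * np.logical_and(p, t).sum() / (p.sum() + t.sum())
    metrics["mean_iou"] = np.nanmean([metrics["iou_unlabeled"], metrics["iou_circuit"]])
    metrics["mean_accuracy"] = np.nanmean([metrics["accuracy_unlabeled"], metrics["accuracy_circuit"]])
    return metrics

# Tiny worked example: 3 of 4 circuit pixels predicted correctly.
pred = np.array([[1, 1], [0, 1]])
target = np.array([[1, 1], [1, 1]])
print(binary_segmentation_metrics(pred, target))
# circuit IoU = 3/4, circuit accuracy = 3/4, Dice = 2*3/(3+4) ≈ 0.857
```
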
Framework versions

  • Transformers 4.36.2
  • Pytorch 1.11.0+cu115
  • Datasets 2.15.0
  • Tokenizers 0.15.0