---
license: other
base_model: nvidia/mit-b3
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b2-seed63-apr-13-v1
results: []
---
# segformer-b2-seed63-apr-13-v1
This model is a fine-tuned version of [nvidia/mit-b3](https://huggingface.co/nvidia/mit-b3) on the unreal-hug/REAL_DATASET_SEG_401_6_lbls dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7138
- Mean Iou: 0.1266
- Mean Accuracy: 0.2136
- Overall Accuracy: 0.4273
- Accuracy Unlabeled: nan
- Accuracy Lv: 0.6939
- Accuracy Rv: 0.0982
- Accuracy Ra: 0.1706
- Accuracy La: 0.5041
- Accuracy Vs: 0.0
- Accuracy As: 0.0
- Accuracy Mk: 0.0
- Accuracy Tk: nan
- Accuracy Asd: 0.0557
- Accuracy Vsd: 0.2283
- Accuracy Ak: 0.3849
- Iou Unlabeled: 0.0
- Iou Lv: 0.4965
- Iou Rv: 0.0899
- Iou Ra: 0.1288
- Iou La: 0.2845
- Iou Vs: 0.0
- Iou As: 0.0
- Iou Mk: 0.0
- Iou Tk: 0.0
- Iou Asd: 0.0462
- Iou Vsd: 0.1513
- Iou Ak: 0.3225
## Model description
This is a SegFormer semantic segmentation model that pairs the [nvidia/mit-b3](https://huggingface.co/nvidia/mit-b3) Mix Transformer (MiT) encoder with SegFormer's lightweight all-MLP decode head. It was fine-tuned for multi-class segmentation on the unreal-hug/REAL_DATASET_SEG_401_6_lbls dataset; the evaluation above reports per-class accuracy and IoU for the labels Lv, Rv, Ra, La, Vs, As, Mk, Tk, Asd, Vsd, and Ak, plus an unlabeled class.
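The snippet below is a minimal inference sketch using the standard Transformers SegFormer classes. The checkpoint id `unreal-hug/segformer-b2-seed63-apr-13-v1` and the input file name are assumptions based on the model name above; adjust them to the actual repository and your own image.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Assumed repo id, taken from the model name above; replace with the real checkpoint path if different.
checkpoint = "unreal-hug/segformer-b2-seed63-apr-13-v1"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("example_frame.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample the logits to the original resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```

The logits come out at a quarter of the input resolution, which is why they are upsampled before the argmax; `SegformerImageProcessor` applies the same resizing and normalization at inference time as during training.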
## Intended uses & limitations
The model is intended for semantic segmentation experiments on data similar to its fine-tuning dataset. Its evaluation scores are low overall (mean IoU 0.1266), and several classes (Vs, As, Mk, Tk) reach an IoU of 0.0 on the evaluation set, so predictions for those classes should not be relied on without further training or validation.
## Training and evaluation data
Fine-tuning and evaluation used the unreal-hug/REAL_DATASET_SEG_401_6_lbls dataset; no further details about the dataset or its splits are provided here.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.05
- training_steps: 1000
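
For reference, the sketch below shows how these hyperparameters map onto `transformers.TrainingArguments`. The output directory and the evaluation schedule (every 100 steps, matching the results table that follows) are assumptions; dataset loading, metric computation, and the `Trainer` setup itself are omitted.

```python
from transformers import TrainingArguments

# Sketch only: maps the reported hyperparameters onto TrainingArguments.
# Values not listed in the card (output_dir, eval schedule) are assumptions.
training_args = TrainingArguments(
    output_dir="segformer-b2-seed63-apr-13-v1",  # assumed output directory
    learning_rate=1e-6,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.05,
    max_steps=1000,
    evaluation_strategy="steps",  # assumption: evaluate every 100 steps, as in the table below
    eval_steps=100,
    logging_steps=100,
)
```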
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Lv | Accuracy Rv | Accuracy Ra | Accuracy La | Accuracy Vs | Accuracy As | Accuracy Mk | Accuracy Tk | Accuracy Asd | Accuracy Vsd | Accuracy Ak | Iou Unlabeled | Iou Lv | Iou Rv | Iou Ra | Iou La | Iou Vs | Iou As | Iou Mk | Iou Tk | Iou Asd | Iou Vsd | Iou Ak |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:-----------:|:------------:|:------------:|:-----------:|:-------------:|:------:|:------:|:------:|:------:|:------:|:------:|:------:|:------:|:-------:|:-------:|:------:|
| 2.5423 | 2.5 | 100 | 2.6367 | 0.0332 | 0.0976 | 0.0951 | nan | 0.0612 | 0.0642 | 0.0301 | 0.1898 | 0.0 | 0.0 | 0.0086 | nan | 0.0495 | 0.4697 | 0.1033 | 0.0 | 0.0573 | 0.0485 | 0.0262 | 0.1021 | 0.0 | 0.0 | 0.0019 | 0.0 | 0.0204 | 0.0612 | 0.0812 |
| 2.3042 | 5.0 | 200 | 2.3925 | 0.0604 | 0.1412 | 0.1975 | nan | 0.2435 | 0.0655 | 0.1292 | 0.2869 | 0.0 | 0.0 | 0.0046 | nan | 0.0669 | 0.4894 | 0.1258 | 0.0 | 0.2144 | 0.0516 | 0.1074 | 0.1515 | 0.0 | 0.0 | 0.0017 | 0.0 | 0.0243 | 0.0670 | 0.1063 |
| 2.0869 | 7.5 | 300 | 2.2183 | 0.0932 | 0.1839 | 0.3354 | nan | 0.5208 | 0.0717 | 0.1836 | 0.4192 | 0.0 | 0.0 | 0.0006 | nan | 0.0768 | 0.3608 | 0.2060 | 0.0 | 0.4077 | 0.0617 | 0.1436 | 0.2158 | 0.0 | 0.0 | 0.0003 | 0.0 | 0.0358 | 0.0787 | 0.1746 |
| 2.0559 | 10.0 | 400 | 2.0298 | 0.1110 | 0.2055 | 0.3886 | nan | 0.6144 | 0.1027 | 0.1815 | 0.4598 | 0.0 | 0.0 | 0.0005 | nan | 0.0909 | 0.3011 | 0.3041 | 0.0 | 0.4559 | 0.0880 | 0.1400 | 0.2409 | 0.0 | 0.0 | 0.0003 | 0.0 | 0.0534 | 0.1001 | 0.2538 |
| 1.9554 | 12.5 | 500 | 1.8871 | 0.1189 | 0.2111 | 0.4100 | nan | 0.6561 | 0.1004 | 0.1647 | 0.4900 | 0.0 | 0.0 | 0.0009 | nan | 0.0763 | 0.2611 | 0.3619 | 0.0 | 0.4739 | 0.0896 | 0.1263 | 0.2616 | 0.0 | 0.0 | 0.0007 | 0.0 | 0.0531 | 0.1207 | 0.3015 |
| 2.0181 | 15.0 | 600 | 1.7720 | 0.1247 | 0.2139 | 0.4199 | nan | 0.6735 | 0.1008 | 0.1723 | 0.4898 | 0.0 | 0.0 | 0.0 | nan | 0.0706 | 0.2349 | 0.3972 | 0.0 | 0.4860 | 0.0912 | 0.1293 | 0.2720 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0532 | 0.1386 | 0.3256 |
| 1.6723 | 17.5 | 700 | 1.7386 | 0.1258 | 0.2129 | 0.4251 | nan | 0.6860 | 0.1011 | 0.1724 | 0.5062 | 0.0 | 0.0 | 0.0 | nan | 0.0615 | 0.2167 | 0.3848 | 0.0 | 0.4927 | 0.0917 | 0.1304 | 0.2814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0488 | 0.1426 | 0.3221 |
| 1.5613 | 20.0 | 800 | 1.7751 | 0.1269 | 0.2151 | 0.4322 | nan | 0.7050 | 0.1020 | 0.1730 | 0.5066 | 0.0 | 0.0 | 0.0 | nan | 0.0570 | 0.2288 | 0.3788 | 0.0 | 0.4990 | 0.0927 | 0.1308 | 0.2841 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0465 | 0.1502 | 0.3199 |
| 1.5653 | 22.5 | 900 | 1.7222 | 0.1272 | 0.2142 | 0.4277 | nan | 0.6924 | 0.1003 | 0.1794 | 0.5018 | 0.0 | 0.0 | 0.0 | nan | 0.0568 | 0.2295 | 0.3814 | 0.0 | 0.4969 | 0.0914 | 0.1341 | 0.2837 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0466 | 0.1523 | 0.3209 |
| 1.5196 | 25.0 | 1000 | 1.7138 | 0.1266 | 0.2136 | 0.4273 | nan | 0.6939 | 0.0982 | 0.1706 | 0.5041 | 0.0 | 0.0 | 0.0 | nan | 0.0557 | 0.2283 | 0.3849 | 0.0 | 0.4965 | 0.0899 | 0.1288 | 0.2845 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0462 | 0.1513 | 0.3225 |
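
The per-class columns above line up with the outputs of the `mean_iou` metric from the `evaluate` library (`mean_iou`, `mean_accuracy`, `overall_accuracy`, `per_category_iou`, `per_category_accuracy`). Below is a minimal sketch of computing it on toy masks; the number of labels and the ignore index are assumptions.

```python
import numpy as np
import evaluate

# Toy 4x4 masks standing in for one predicted mask and its ground truth.
pred = np.array([[0, 1, 1, 2],
                 [0, 1, 2, 2],
                 [3, 3, 2, 2],
                 [3, 3, 4, 4]])
ref = np.array([[0, 1, 1, 2],
                [0, 1, 1, 2],
                [3, 3, 2, 2],
                [3, 4, 4, 4]])

metric = evaluate.load("mean_iou")
results = metric.compute(
    predictions=[pred],
    references=[ref],
    num_labels=12,      # assumption: 12 class ids covering the labels listed above
    ignore_index=255,   # assumption: pixels with this id are excluded from scoring
    reduce_labels=False,
)

# Classes that never occur in the references come back as nan,
# which matches the nan accuracy entries in the table.
print(results["mean_iou"], results["overall_accuracy"])
print(results["per_category_iou"])
```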
### Framework versions
- Transformers 4.37.2
- Pytorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0