---
license: other
tags:
- generated_from_trainer
model-index:
- name: dropoff-utcustom-train-SF-RGBD-b0_1
  results: []
---
|
|
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
|
|
|
# dropoff-utcustom-train-SF-RGBD-b0_1
|
|
|
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unspecified dataset.

It achieves the following results on the evaluation set:
|
- Loss: 0.4979
- Mean Iou: 0.4170
- Mean Accuracy: 0.6846
- Overall Accuracy: 0.9603
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.3839
- Accuracy Undropoff: 0.9853
- Iou Unlabeled: 0.0
- Iou Dropoff: 0.2914
- Iou Undropoff: 0.9597
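
The summary numbers appear to be unweighted per-class averages: Mean Accuracy is the mean of the two labelled classes, (0.3839 + 0.9853) / 2 ≈ 0.6846, while Mean Iou averages all three classes including the unlabeled one, (0.0 + 0.2914 + 0.9597) / 3 ≈ 0.4170. The zero IoU reported for the unlabeled class therefore pulls the mean IoU down even though that class contributes no accuracy figure (Accuracy Unlabeled is nan).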
|
|
|
## Model description

More information needed

## Intended uses & limitations

More information needed
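
Since the base model is [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) (the SegFormer MiT-B0 backbone), the checkpoint can presumably be loaded with the standard `transformers` semantic-segmentation classes. The sketch below is illustrative only: the repo id `your-username/dropoff-utcustom-train-SF-RGBD-b0_1` is a placeholder, and it assumes plain 3-channel image input (the "RGBD" in the model name suggests depth may be fused into the input, which is not documented here).

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Placeholder repo id; replace with wherever this checkpoint is actually hosted.
repo_id = "your-username/dropoff-utcustom-train-SF-RGBD-b0_1"

# If the checkpoint ships without a preprocessor config, the base model's can be
# used instead: SegformerImageProcessor.from_pretrained("nvidia/mit-b0")
processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("example.png").convert("RGB")  # assumes standard RGB input
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample logits to the original resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # class index per pixel
```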
|
|
|
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
|
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
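
For reference, these settings map roughly onto the following `transformers.TrainingArguments`. This is a reconstruction from the list above, not the original training script, and the evaluation cadence is inferred from the 10-step spacing in the results table below.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the arguments implied by the hyperparameter list;
# the actual training script is not part of this card.
training_args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGBD-b0_1",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=120,
    evaluation_strategy="steps",  # assumption: eval every 10 steps, as in the log below
    eval_steps=10,
    logging_steps=10,
)
```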
|
|
|
### Training results
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.0495        | 5.0   | 10   | 1.0890          | 0.1852   | 0.3572        | 0.4990           | nan                | 0.2026           | 0.5119             | 0.0           | 0.0474      | 0.5081        |
| 0.9941        | 10.0  | 20   | 1.0479          | 0.3452   | 0.8357        | 0.8479           | nan                | 0.8225           | 0.8490             | 0.0           | 0.1931      | 0.8425        |
| 0.9448        | 15.0  | 30   | 0.9839          | 0.3790   | 0.8217        | 0.9010           | nan                | 0.7351           | 0.9082             | 0.0           | 0.2390      | 0.8980        |
| 0.8912        | 20.0  | 40   | 0.9041          | 0.3845   | 0.7150        | 0.9247           | nan                | 0.4863           | 0.9437             | 0.0           | 0.2303      | 0.9233        |
| 0.8458        | 25.0  | 50   | 0.7997          | 0.3835   | 0.6687        | 0.9326           | nan                | 0.3808           | 0.9565             | 0.0           | 0.2188      | 0.9316        |
| 0.8299        | 30.0  | 60   | 0.7387          | 0.3751   | 0.6333        | 0.9326           | nan                | 0.3068           | 0.9597             | 0.0           | 0.1934      | 0.9318        |
| 0.7518        | 35.0  | 70   | 0.6810          | 0.3791   | 0.6322        | 0.9404           | nan                | 0.2961           | 0.9683             | 0.0           | 0.1975      | 0.9397        |
| 0.6943        | 40.0  | 80   | 0.6322          | 0.3703   | 0.6069        | 0.9422           | nan                | 0.2411           | 0.9726             | 0.0           | 0.1691      | 0.9417        |
| 0.6617        | 45.0  | 90   | 0.6071          | 0.3780   | 0.6240        | 0.9454           | nan                | 0.2734           | 0.9746             | 0.0           | 0.1892      | 0.9449        |
| 0.634         | 50.0  | 100  | 0.5932          | 0.3765   | 0.6106        | 0.9497           | nan                | 0.2407           | 0.9805             | 0.0           | 0.1802      | 0.9494        |
| 0.6157        | 55.0  | 110  | 0.5829          | 0.3982   | 0.6538        | 0.9524           | nan                | 0.3281           | 0.9795             | 0.0           | 0.2425      | 0.9520        |
| 0.5814        | 60.0  | 120  | 0.5708          | 0.4038   | 0.6699        | 0.9533           | nan                | 0.3608           | 0.9790             | 0.0           | 0.2586      | 0.9528        |
| 0.5988        | 65.0  | 130  | 0.5575          | 0.3974   | 0.6456        | 0.9569           | nan                | 0.3061           | 0.9851             | 0.0           | 0.2357      | 0.9564        |
| 0.5583        | 70.0  | 140  | 0.5530          | 0.4224   | 0.7075        | 0.9576           | nan                | 0.4346           | 0.9803             | 0.0           | 0.3103      | 0.9570        |
| 0.5596        | 75.0  | 150  | 0.5264          | 0.4034   | 0.6522        | 0.9598           | nan                | 0.3167           | 0.9877             | 0.0           | 0.2510      | 0.9593        |
| 0.5524        | 80.0  | 160  | 0.5392          | 0.4208   | 0.7109        | 0.9567           | nan                | 0.4429           | 0.9790             | 0.0           | 0.3065      | 0.9560        |
| 0.5294        | 85.0  | 170  | 0.5257          | 0.4161   | 0.6913        | 0.9582           | nan                | 0.4002           | 0.9824             | 0.0           | 0.2909      | 0.9576        |
| 0.5477        | 90.0  | 180  | 0.5178          | 0.4207   | 0.6962        | 0.9591           | nan                | 0.4095           | 0.9829             | 0.0           | 0.3035      | 0.9584        |
| 0.528         | 95.0  | 190  | 0.5185          | 0.4183   | 0.6939        | 0.9590           | nan                | 0.4047           | 0.9831             | 0.0           | 0.2965      | 0.9584        |
| 0.5144        | 100.0 | 200  | 0.5004          | 0.4153   | 0.6788        | 0.9604           | nan                | 0.3716           | 0.9860             | 0.0           | 0.2859      | 0.9599        |
| 0.5313        | 105.0 | 210  | 0.5032          | 0.4199   | 0.7005        | 0.9585           | nan                | 0.4191           | 0.9819             | 0.0           | 0.3020      | 0.9578        |
| 0.5172        | 110.0 | 220  | 0.4993          | 0.4188   | 0.6931        | 0.9591           | nan                | 0.4030           | 0.9832             | 0.0           | 0.2978      | 0.9585        |
| 0.5124        | 115.0 | 230  | 0.4999          | 0.4167   | 0.6828        | 0.9606           | nan                | 0.3799           | 0.9858             | 0.0           | 0.2901      | 0.9600        |
| 0.5025        | 120.0 | 240  | 0.4979          | 0.4170   | 0.6846        | 0.9603           | nan                | 0.3839           | 0.9853             | 0.0           | 0.2914      | 0.9597        |
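
The column names above follow the output of the `mean_iou` metric from the `evaluate` library (mean_iou, mean_accuracy, overall_accuracy, plus per-category accuracy and IoU). The sketch below shows how such values are typically computed for this three-class setup; it is an assumption about the evaluation code, which is not included in this card.

```python
import numpy as np
import evaluate

# Assumed class indexing: 0 = unlabeled, 1 = dropoff, 2 = undropoff.
metric = evaluate.load("mean_iou")

# Toy label maps purely to illustrate the call signature.
reference = np.random.randint(0, 3, size=(128, 128))
prediction = np.random.randint(0, 3, size=(128, 128))

results = metric.compute(
    predictions=[prediction],
    references=[reference],
    num_labels=3,
    ignore_index=255,    # value for pixels excluded from scoring (assumption)
    reduce_labels=False,
)

print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
print(results["per_category_iou"])       # one IoU per class, as in the Iou columns
print(results["per_category_accuracy"])  # nan where a class has no reference pixels
```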
|
|
|
|
|
### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
|
|