---
license: other
tags:
- generated_from_trainer
model-index:
- name: segformer-b0-drone-real
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-drone-real
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on a dataset that is not specified in this card.
It achieves the following results on the evaluation set:
- Loss: 2.5066
- Mean Iou: 0.0277
- Mean Accuracy: 0.0561
- Overall Accuracy: 0.3272
- Per Category Iou: [nan, 0.3359579661339679, 0.0, 0.20517716653585355, 0.003702068923287315, 0.0, 0.0, 0.0, 0.006312134342948017, 0.058787282412541476, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan]
- Per Category Accuracy: [nan, 0.6093986043791881, 0.0, 0.4105641817321649, 0.003921086446309355, 0.0, 0.0, 0.0, 0.00706877745310955, 0.20348775859354865, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan]
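
As a minimal sketch, the checkpoint can be loaded for semantic-segmentation inference roughly as follows. The repo id and image path are hypothetical placeholders, not confirmed by this card; substitute wherever the checkpoint actually lives.

```python
from PIL import Image
import torch
from transformers import SegformerFeatureExtractor, SegformerForSemanticSegmentation

# Hypothetical checkpoint path; replace with the Hub repo id or a local directory.
checkpoint = "segformer-b0-drone-real"

feature_extractor = SegformerFeatureExtractor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("drone_frame.jpg")  # hypothetical input image
inputs = feature_extractor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_seg = upsampled.argmax(dim=1)[0]  # (H, W) map of predicted class ids
```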
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
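
These hyperparameters correspond roughly to the `TrainingArguments` sketched below. Dataset loading, preprocessing, and the metric function are omitted; `train_ds`, `eval_ds`, and the label count of 24 (inferred from the length of the per-category metric arrays) are assumptions, not details stated in the card.

```python
from transformers import SegformerForSemanticSegmentation, Trainer, TrainingArguments

# num_labels=24 is an assumption based on the per-category metric arrays above.
model = SegformerForSemanticSegmentation.from_pretrained("nvidia/mit-b0", num_labels=24)

args = TrainingArguments(
    output_dir="segformer-b0-drone-real",
    learning_rate=6e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="steps",
    eval_steps=40,  # matches the 40-step evaluation interval in the results table
    remove_unused_columns=False,  # keep pixel values/labels produced by the image processor
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,  # hypothetical preprocessed training split
    eval_dataset=eval_ds,    # hypothetical preprocessed evaluation split
)
trainer.train()
```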
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|
| 2.8285 | 2.0 | 40 | 2.8542 | 0.0229 | 0.0536 | 0.2211 | [nan, 0.23060686675928652, 3.3649639948852546e-05, 0.16273692278074856, 0.013325250843685254, 4.063223761732559e-05, 0.0, 0.0, 0.035811860573766785, 0.06198642655834834, 0.0, 0.0, 6.128953174797744e-05, 0.0, 0.0, 0.0, 5.1148278860416345e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] | [nan, 0.33385338422824606, 3.373169528473065e-05, 0.299629413956809, 0.014924989967585504, 4.09158863797309e-05, 0.0, 0.0, 0.06156816279101685, 0.46948081490592114, 0.0, 0.0, 7.101768340316739e-05, 0.0, 0.0, 0.0, 5.163155720776539e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] |
| 2.6567 | 4.0 | 80 | 2.6894 | 0.0262 | 0.0563 | 0.2866 | [nan, 0.30624516924744655, 0.0, 0.18452743224435408, 0.004815305953306897, 0.0, 0.0, 0.0, 0.013916000286575587, 0.06596235864669749, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] | [nan, 0.49410241551371004, 0.0, 0.3789267923602906, 0.005102681931249924, 0.0, 0.0, 0.0, 0.016502745882809654, 0.3439400151190408, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] |
| 2.3286 | 6.0 | 120 | 2.5923 | 0.0267 | 0.0560 | 0.3328 | [nan, 0.35729772739055154, 0.0, 0.1539404924138176, 0.0035934817901112264, 0.0, 0.0, 0.0, 0.005766957703680072, 0.06586525335295575, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] | [nan, 0.7280486819078453, 0.0, 0.23220288304265976, 0.003859608350752127, 0.0, 0.0, 0.0, 0.006166880921555261, 0.2614486947285752, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] |
| 2.3836 | 8.0 | 160 | 2.5890 | 0.0275 | 0.0575 | 0.3244 | [nan, 0.349601471481378, 0.0, 0.17496917223449057, 0.0044570694472161245, 0.0, 0.0, 0.0, 0.007437757319012064, 0.0682080200846761, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] | [nan, 0.6614103864255038, 0.0, 0.29170107802854334, 0.004828395043379214, 0.0, 0.0, 0.0, 0.008074588150255963, 0.29956113264795986, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] |
| 2.2849 | 10.0 | 200 | 2.5525 | 0.0283 | 0.0569 | 0.3388 | [nan, 0.3351423574874955, 0.0, 0.21293393590619594, 0.002350142484175345, 0.0, 0.0, 0.0, 0.005453996034797477, 0.06749403988523864, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] | [nan, 0.6106816065491506, 0.0, 0.4703984171366385, 0.002485471577527932, 0.0, 0.0, 0.0, 0.0058035081160884615, 0.16273986773434082, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] |
| 2.1539 | 12.0 | 240 | 2.5352 | 0.0289 | 0.0579 | 0.3431 | [nan, 0.34790380657207765, 0.0, 0.21813897765519252, 0.0039179488370322925, 0.0, 0.0, 0.0, 0.004069615148127275, 0.06184727957512016, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] | [nan, 0.6213138456628569, 0.0, 0.4695635471053968, 0.004103493982578053, 0.0, 0.0, 0.0, 0.004355898828122703, 0.17384734612397268, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] |
| 2.1894 | 14.0 | 280 | 2.5923 | 0.0271 | 0.0567 | 0.3056 | [nan, 0.3019300415296601, 0.0, 0.2108717486241356, 0.003421017543191299, 0.0, 0.0, 0.0, 0.012217783066499338, 0.06721066611554333, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] | [nan, 0.4912292097745947, 0.0, 0.49141307925385636, 0.0036400437237620272, 0.0, 0.0, 0.0, 0.01400423107111977, 0.2475275870599001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] |
| 1.9499 | 16.0 | 320 | 2.5169 | 0.0282 | 0.0569 | 0.3320 | [nan, 0.33589786653498954, 0.0, 0.20789931305652776, 0.004207056605876958, 0.0, 0.0, 0.0, 0.0075080990386678535, 0.06424320004285503, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] | [nan, 0.613616488775929, 0.0, 0.4253884604222434, 0.0044872253922099824, 0.0, 0.0, 0.0, 0.008369991941750555, 0.19915876549744932, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] |
| 2.115 | 18.0 | 360 | 2.5153 | 0.0277 | 0.0562 | 0.3256 | [nan, 0.3342799309383031, 0.0, 0.2066227292961174, 0.0034518271833141106, 0.0, 0.0, 0.0, 0.006799912048482982, 0.05915124911379064, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] | [nan, 0.5983876453859164, 0.0, 0.4204451510267332, 0.0036636891451301916, 0.0, 0.0, 0.0, 0.007619718595211191, 0.20718275529815128, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] |
| 2.0119 | 20.0 | 400 | 2.5066 | 0.0277 | 0.0561 | 0.3272 | [nan, 0.3359579661339679, 0.0, 0.20517716653585355, 0.003702068923287315, 0.0, 0.0, 0.0, 0.006312134342948017, 0.058787282412541476, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] | [nan, 0.6093986043791881, 0.0, 0.4105641817321649, 0.003921086446309355, 0.0, 0.0, 0.0, 0.00706877745310955, 0.20348775859354865, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan] |
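
The card does not state how the metrics were computed, but figures like Mean IoU, Overall Accuracy, and the per-category arrays above are typically obtained with the `evaluate` library's `mean_iou` metric. The sketch below is an assumption about that procedure; `pred_maps`, `label_maps`, `num_labels=24`, and `ignore_index=255` are illustrative placeholders.

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Placeholder (H, W) integer arrays of predicted and ground-truth class ids.
pred_maps = [np.zeros((512, 512), dtype=np.int64)]
label_maps = [np.zeros((512, 512), dtype=np.int64)]

results = metric.compute(
    predictions=pred_maps,
    references=label_maps,
    num_labels=24,     # assumed from the length of the per-category arrays
    ignore_index=255,  # common convention for unlabeled pixels; an assumption here
)
print(results["mean_iou"], results["overall_accuracy"], results["per_category_iou"])
```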
### Framework versions
- Transformers 4.25.1
- Pytorch 1.12.1+cu113
- Datasets 2.7.1
- Tokenizers 0.13.2