---
license: other
tags:
- generated_from_trainer
datasets:
- scene_parse_150
model-index:
- name: segformer-b0-scene-parse-150_epoch_100_230609
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# segformer-b0-scene-parse-150_epoch_100_230609

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the scene_parse_150 dataset.
It achieves the following results on the evaluation set:
- Loss: 2.7126
- Mean IoU: 0.1053
- Mean Accuracy: 0.1994
- Overall Accuracy: 0.5447
- Per-category IoU: [0.48741983413436024, 0.34708122936068353, 0.8494644532893246, 0.3618389507826823, 0.016919144195669256, 0.746579767268802, 0.0, 0.4008814740204453, 0.26432782122527576, 0.0, 0.0, 0.2358305940560507, 0.13905866374131537, nan, 0.0, 0.0, 0.5318380393695908, 0.0, 0.0, 0.041298586572438165, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]
- Per-category Accuracy: [0.7865618692274757, 0.9652097859624402, 0.9908729919072352, 0.5594874236350619, 0.12989690721649486, 0.8943671630094044, nan, 0.8825049920983964, 0.29573472254593786, nan, 0.0, 0.9468519337392428, 0.16706413957574998, nan, 0.0, 0.0, 0.5378679869020947, 0.0, 0.0, 0.21969845310358332, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]

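The checkpoint can be used for semantic segmentation directly through `transformers`. Below is a minimal inference sketch, assuming the model is hosted under the hypothetical repo id `manadopeee/segformer-b0-scene-parse-150_epoch_100_230609` and that a local test image `scene.jpg` exists:

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Hypothetical repo id and image path; adjust to wherever this checkpoint actually lives.
checkpoint = "manadopeee/segformer-b0-scene-parse-150_epoch_100_230609"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("scene.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (batch, num_labels, height/4, width/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
label_map = upsampled.argmax(dim=1)[0]  # (height, width) tensor of class ids
```
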
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

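The metadata above points at the scene_parse_150 dataset, though the exact split and preprocessing used for this fine-tune are not recorded here. As a starting point, the public Hub dataset can be loaded as below (a minimal sketch; field names follow the default scene-parsing config):

```python
from datasets import load_dataset

# ADE20K scene-parsing data: each example pairs an RGB image with a
# per-pixel "annotation" label map. How this card's training subset was
# derived from it is not documented, so this is only a starting point.
ds = load_dataset("scene_parse_150", split="train")
example = ds[0]
print(example["image"].size, example["annotation"].size)
```
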

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

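These settings map onto `transformers.TrainingArguments` roughly as follows. This is a minimal sketch, not the original training script; `output_dir` and anything not listed above are assumptions:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is an assumption,
# since the original training script is not part of this card.
training_args = TrainingArguments(
    output_dir="segformer-b0-scene-parse-150_epoch_100_230609",
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```
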

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean IoU | Mean Accuracy | Overall Accuracy | Per-category IoU | Per-category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
| 3.335 | 20.0 | 100 | 3.5913 | 0.0958 | 0.1968 | 0.4914 | [0.4372210968359756, 0.3028306951772656, 0.9033017061947888, 0.3690449269582307, 0.05890453885736904, 0.521817339647163, 0.0, 0.3631349261471501, 0.05912798485639358, nan, 0.0, 0.23295937758137303, 0.12080500701413618, 0.0, 0.0, 0.0, 0.40666846895557357, 0.0, 0.0, 0.15182824063896827, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7828715313882603, 0.9935011297101626, 0.9796020050730765, 0.6524082439607645, 0.5454753722794959, 0.5683581504702194, nan, 0.7686116453711304, 0.05922631608786308, nan, 0.0, 0.9736725738970713, 0.14051713317434417, nan, 0.0, 0.0, 0.41243778756116795, 0.0, 0.0, 0.40571764245153713, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 2.2088 | 40.0 | 200 | 2.9755 | 0.1102 | 0.2011 | 0.5560 | [0.45422192073986317, 0.3436668041953486, 0.8903445028964444, 0.36640300640210627, 0.08482177830003917, 0.696578291411738, 0.0, 0.3924824887368871, 0.1146148769912978, nan, 0.0, 0.2583488263193765, 0.09984717269485481, nan, 0.0, 0.0, 0.6613259967945657, 0.0, 0.0, 0.044113233970191054, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7754602727995596, 0.9612375792619227, 0.9850827394612875, 0.5917451364483808, 0.3307369224894998, 0.9034090909090909, nan, 0.8772617574636465, 0.11706677921472532, nan, 0.0, 0.9477023027994149, 0.11070666499309652, nan, 0.0, 0.0, 0.6691055509423911, 0.0, 0.0, 0.17270413158410025, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 1.8764 | 60.0 | 300 | 2.8496 | 0.1046 | 0.1910 | 0.5299 | [0.44823292205691717, 0.3374611910810048, 0.8521673994463442, 0.36771300448430494, 0.011525925925925926, 0.6769752103220841, nan, 0.4127585356400409, 0.19237793012603657, nan, 0.0, 0.23536215301960003, 0.10166928075285682, nan, 0.0, 0.0, 0.502039728794969, 0.0, 0.0, 0.044836210577685595, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.00027059937762143147, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7515036597549289, 0.9295206627632954, 0.9876570237951443, 0.5829066103598283, 0.08911798396334479, 0.9305789576802508, nan, 0.8610542821791275, 0.20239752562253144, nan, 0.0, 0.9655940678254362, 0.1139073678925568, nan, 0.0, 0.0, 0.5067709067740402, 0.0, 0.0, 0.14392010965341687, nan, nan, nan, 0.0, nan, nan, nan, 0.0002748763056624519, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 1.6882 | 80.0 | 400 | 2.6676 | 0.1123 | 0.2036 | 0.5699 | [0.4826916675912571, 0.35289291208668705, 0.8613952449463594, 0.3690071358526864, 0.04114119410882794, 0.7420633159137224, 0.0, 0.39243581224605395, 0.26480929728158487, nan, 0.0, 0.242911210420564, 0.12443874278383579, nan, 0.0, 0.0, 0.6824408307674852, 0.0, 0.0, 0.04806344199088679, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7805655799539217, 0.9613226112096402, 0.9898462978620607, 0.573010513921042, 0.1424971363115693, 0.9007004310344827, nan, 0.8826764997733648, 0.29841193849332587, nan, 0.0, 0.9540290486070955, 0.14610267352830425, nan, 0.0, 0.0, 0.6910372308479692, 0.0, 0.0, 0.21480321127863716, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 1.9454 | 100.0 | 500 | 2.7126 | 0.1053 | 0.1994 | 0.5447 | [0.48741983413436024, 0.34708122936068353, 0.8494644532893246, 0.3618389507826823, 0.016919144195669256, 0.746579767268802, 0.0, 0.4008814740204453, 0.26432782122527576, 0.0, 0.0, 0.2358305940560507, 0.13905866374131537, nan, 0.0, 0.0, 0.5318380393695908, 0.0, 0.0, 0.041298586572438165, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7865618692274757, 0.9652097859624402, 0.9908729919072352, 0.5594874236350619, 0.12989690721649486, 0.8943671630094044, nan, 0.8825049920983964, 0.29573472254593786, nan, 0.0, 0.9468519337392428, 0.16706413957574998, nan, 0.0, 0.0, 0.5378679869020947, 0.0, 0.0, 0.21969845310358332, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |

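Metrics of this shape (mean/overall accuracy, mean IoU, per-category arrays) match what the `evaluate` library's `mean_iou` metric returns; `nan` entries typically correspond to categories absent from the evaluation images, where IoU is undefined. A small sketch of computing such metrics, with toy inputs (the actual evaluation loop used for this card is an assumption):

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy 2x2 label maps; in practice these would be per-pixel predictions and
# ground-truth annotations accumulated over the whole validation set.
predictions = [np.array([[0, 1], [2, 2]])]
references = [np.array([[0, 1], [2, 3]])]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=150,    # scene_parse_150 has 150 semantic classes
    ignore_index=255,  # common convention for unlabeled pixels (assumption)
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
```

The `per_category_iou` and `per_category_accuracy` keys in that result dict correspond to the per-category arrays reported above.
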

### Framework versions

- Transformers 4.29.2
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.3