---
license: other
tags:
- generated_from_trainer
datasets:
- scene_parse_150
model-index:
- name: nommis_segformer5
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# nommis_segformer5

This model is a fine-tuned version of [nvidia/segformer-b0-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512) on the scene_parse_150 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4968
- Mean Iou: 0.3573
- Mean Accuracy: 0.4923
- Overall Accuracy: 0.8411
- Per Category Iou: [0.74975317392031, 0.7794410147887274, 0.9477603899927828, 0.7656932606023603, 0.723857664111405, 0.8716512692421345, 0.863022621721695, 0.9240294561369442, 0.7242633568910231, 0.6134294305818301, 0.48159993522529454, 0.6128290042426419, 0.7376616340096556, 0.14249931825756734, 0.35176633571923344, 0.5620357131338759, 0.8638289260658392, 0.415410472072499, 0.5517729659566936, 0.6172647443104129, 0.8124566178707076, 0.0, 0.6428092209430266, 0.0, 0.2104813929668829, 0.4251821264997899, 0.9380425651078129, 0.8474509980895648, 0.5821050652926282, 0.4506812149650353, 0.6943803190572994, 0.0, 0.17265479960685814, 0.23995127892813642, 0.0, 0.0, 0.2724044481824129, 0.0, 0.392571160462934, 0.21584808909542505, 0.2496028068317225, 0.0, 0.09854609688158163, 0.022266401590457258, nan, 0.0, 0.9471716274248482, 0.6674884632404071, nan, 0.8374251999328897, 0.0, 0.49230769230769234, 0.0, 0.4042453273222504, 0.0, nan, nan, 0.5250632076508739, 0.9178690960601052, 0.2001023541453429, nan, nan, 0.5349142857142857, 0.0, 0.0, 0.8767808047988003, 0.6694305040366572, 0.7116823374785158, 0.6645531400966184, 0.0, 0.7045357686453577, nan, 0.33535395047775607, nan, 0.7918331226295828, nan, 0.0, nan, 0.0, 0.10583069982872169, nan, 0.18884178859565157, 0.11131725417439703, nan, nan, 0.617096297708105, 0.5133936387510942, 0.023268853378188955, 0.0, 0.8122001370801919, nan, 0.0, 0.6120563928050559, 0.015886896953796824, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.13397003128602009, nan, 0.6251570200019325, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.2817006400534663, 0.0, 0.0, nan, 0.4339781328847771, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.37268373932958565, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.36669593210418494, 0.0, nan, nan, nan, 0.0009633911368015414, 0.0, 0.029017047515415305]
- Per Category Accuracy: [0.8500475032918948, 0.9090727442266007, 0.9790694696013319, 0.9066653475744558, 0.8036125736948979, 0.9619617217520251, 0.9428051445352432, 0.9501946899936256, 0.7538857522417801, 0.66818192208844, 0.6985935343688522, 0.6854295277054667, 0.8803769797226216, 0.33735832815264694, 0.3681730304494925, 0.7064864864864865, 0.9939117530177338, 0.49297531916389165, 0.6516846102923243, 0.7121253549269113, 0.9651162072522219, nan, 0.9039209026755727, 0.0, 0.23541766109785203, 0.9826890059795157, 0.966677789100616, 0.9145306082693463, 0.662943420936646, 0.6884258672110558, 0.9477743551155736, nan, 0.27695728338498854, 0.6679707876890976, 0.0, nan, 0.6349379220213461, nan, 0.41777903531839816, 0.21584808909542505, 0.3558586192251427, 0.0, 0.09857623432668457, 0.022798208712172616, nan, nan, 0.9908985282726569, 0.7127235909551131, nan, 0.884569943289225, nan, 0.494096872882574, 0.0, 0.4246669955599408, nan, nan, nan, 0.9346443596516975, 0.9835706951203179, 0.22040586245772267, nan, nan, 0.665907878356749, nan, 0.0, 0.9194746192154236, 0.713322483143455, 0.8259364641694289, 0.7627227722772277, 0.0, 0.9854381333560419, nan, 0.35627353599301426, nan, 0.9498058959116826, nan, 0.0, nan, nan, 0.13212661506490378, nan, 0.47752420470262796, 0.12829650748396293, nan, nan, 0.7659234727068425, 0.5280946091967823, 0.027880416526704737, nan, 0.8304134548002803, nan, nan, 0.9665496819477956, 0.016129032258064516, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.13411578530924437, nan, 0.6578626264680462, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.2817223650385604, nan, 0.0, nan, 0.48074534161490684, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.463010863942059, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.3707466219548279, 0.0, nan, nan, nan, 0.0009633911368015414, 0.0, 0.029017047515415305]
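## How to use

A minimal inference sketch for this checkpoint. The repo id `your-username/nommis_segformer5` is a placeholder for wherever the fine-tuned weights are hosted, and the preprocessing assumes the standard SegFormer image processor shipped with Transformers.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

repo_id = "your-username/nommis_segformer5"  # placeholder: wherever this checkpoint is hosted
processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) map of scene_parse_150 label ids
```

SegFormer predicts logits at a quarter of the input resolution, so they are upsampled before taking the per-pixel argmax.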
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (mirrored in the configuration sketch after this list):
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

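As a hedged sketch, the hyperparameters above map onto `TrainingArguments` roughly as shown below. The `output_dir` is a placeholder, and the Adam settings listed above match the `Trainer` defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="nommis_segformer5",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    # Adam(betas=(0.9, 0.999), eps=1e-8) is the default optimizer configuration.
)
```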
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|
| 0.2686        | 10.0  | 800  | 0.4968          | 0.3573   | 0.4923        | 0.8411           |

The per-category IoU and per-category accuracy logged at this step are identical to the per-category lists reported under the evaluation results above.

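The metric names reported here match the `evaluate` library's `mean_iou` metric, so values of this shape can be computed along the following lines; `num_labels=150` and `ignore_index=255` are assumptions based on the scene_parse_150 label set rather than settings read from this card.

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# predictions and references are lists of (H, W) integer label maps;
# the zero arrays below are placeholders for real model outputs and ground truth.
results = metric.compute(
    predictions=[np.zeros((512, 512), dtype=np.int64)],
    references=[np.zeros((512, 512), dtype=np.int64)],
    num_labels=150,      # assumption: scene_parse_150 uses 150 classes
    ignore_index=255,    # assumption: unlabeled pixels are mapped to 255
    reduce_labels=False,
)

# results contains "mean_iou", "mean_accuracy", "overall_accuracy",
# plus "per_category_iou" and "per_category_accuracy" arrays like those above.
```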
### Framework versions

- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3