sam1120 committed
Commit 2db175b
1 Parent(s): 18735cd

update model card README.md
Files changed (1):
  1. README.md (new file, +89 −0)
---
license: other
tags:
- generated_from_trainer
model-index:
- name: dropoff-utcustom-train-SF-RGBD-b5_4
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# dropoff-utcustom-train-SF-RGBD-b5_4

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2351
- Mean Iou: 0.4792
- Mean Accuracy: 0.5
- Overall Accuracy: 0.9584
- Accuracy Unlabeled: nan
- Accuracy Dropoff: 0.0
- Accuracy Undropoff: 1.0
- Iou Unlabeled: nan
- Iou Dropoff: 0.0
- Iou Undropoff: 0.9584
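Since this checkpoint is a SegFormer (`mit-b5`) fine-tune for semantic segmentation, a standard `transformers` inference call should work. The sketch below is untested and makes two assumptions: the repo id `sam1120/dropoff-utcustom-train-SF-RGBD-b5_4` is inferred from the commit author and model name, and the preprocessing assumes plain RGB input, whereas the "RGBD" in the name suggests the model may expect an extra fused depth channel.

```python
# Hedged sketch, not the author's code. The repo id is assumed from the
# commit author; adapt the preprocessing if the checkpoint expects RGB-D input.
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

repo = "sam1120/dropoff-utcustom-train-SF-RGBD-b5_4"  # assumed repo id
processor = SegformerImageProcessor.from_pretrained(repo)
model = SegformerForSemanticSegmentation.from_pretrained(repo).eval()

image = Image.open("frame.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample the low-resolution logits back to the image size, then take the
# per-pixel argmax to get class ids (dropoff vs. undropoff).
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]
```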
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 7e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 120
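For reference, a sketch of how these settings map onto `transformers.TrainingArguments`; the output directory name is hypothetical, and the model/dataset/Trainer wiring is omitted:

```python
# Sketch only: the listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dropoff-utcustom-train-SF-RGBD-b5_4",  # hypothetical
    learning_rate=7e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,        # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,     # and epsilon=1e-08
    lr_scheduler_type="linear",
    warmup_ratio=0.05,     # lr_scheduler_warmup_ratio
    num_train_epochs=120,
)
```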
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Dropoff | Accuracy Undropoff | Iou Unlabeled | Iou Dropoff | Iou Undropoff |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:----------------:|:------------------:|:-------------:|:-----------:|:-------------:|
| 1.0114 | 5.0 | 10 | 1.0037 | 0.2459 | 0.4345 | 0.7074 | nan | 0.1368 | 0.7322 | 0.0 | 0.0286 | 0.7089 |
| 0.9088 | 10.0 | 20 | 0.8245 | 0.3119 | 0.5046 | 0.8887 | nan | 0.0857 | 0.9235 | 0.0 | 0.0460 | 0.8897 |
| 0.8029 | 15.0 | 30 | 0.6620 | 0.3157 | 0.4998 | 0.9214 | nan | 0.0399 | 0.9596 | 0.0 | 0.0253 | 0.9219 |
| 0.6935 | 20.0 | 40 | 0.5662 | 0.3154 | 0.4959 | 0.9309 | nan | 0.0214 | 0.9704 | 0.0 | 0.0151 | 0.9311 |
| 0.635 | 25.0 | 50 | 0.5018 | 0.3175 | 0.4978 | 0.9401 | nan | 0.0153 | 0.9803 | 0.0 | 0.0121 | 0.9404 |
| 0.5579 | 30.0 | 60 | 0.4701 | 0.3178 | 0.4978 | 0.9422 | nan | 0.0131 | 0.9825 | 0.0 | 0.0111 | 0.9423 |
| 0.5086 | 35.0 | 70 | 0.4403 | 0.3181 | 0.4977 | 0.9459 | nan | 0.0088 | 0.9866 | 0.0 | 0.0080 | 0.9461 |
| 0.472 | 40.0 | 80 | 0.4328 | 0.3177 | 0.4971 | 0.9471 | nan | 0.0063 | 0.9879 | 0.0 | 0.0059 | 0.9473 |
| 0.4484 | 45.0 | 90 | 0.4136 | 0.3184 | 0.4981 | 0.9506 | nan | 0.0046 | 0.9916 | 0.0 | 0.0044 | 0.9508 |
| 0.4026 | 50.0 | 100 | 0.4013 | 0.3186 | 0.4985 | 0.9516 | nan | 0.0043 | 0.9926 | 0.0 | 0.0042 | 0.9517 |
| 0.3873 | 55.0 | 110 | 0.3621 | 0.3189 | 0.4991 | 0.9557 | nan | 0.0010 | 0.9971 | 0.0 | 0.0009 | 0.9557 |
| 0.3549 | 60.0 | 120 | 0.3479 | 0.3189 | 0.4992 | 0.9564 | nan | 0.0004 | 0.9979 | 0.0 | 0.0004 | 0.9564 |
| 0.3358 | 65.0 | 130 | 0.3282 | 0.3191 | 0.4994 | 0.9571 | nan | 0.0001 | 0.9986 | 0.0 | 0.0001 | 0.9571 |
| 0.3146 | 70.0 | 140 | 0.3141 | 0.3193 | 0.4996 | 0.9577 | nan | 0.0000 | 0.9993 | 0.0 | 0.0000 | 0.9577 |
| 0.3116 | 75.0 | 150 | 0.2941 | 0.3194 | 0.4999 | 0.9582 | nan | 0.0 | 0.9998 | 0.0 | 0.0 | 0.9582 |
| 0.3151 | 80.0 | 160 | 0.2809 | 0.3195 | 0.5000 | 0.9584 | nan | 0.0 | 0.9999 | 0.0 | 0.0 | 0.9584 |
| 0.2778 | 85.0 | 170 | 0.2750 | 0.3195 | 0.5000 | 0.9584 | nan | 0.0 | 1.0000 | 0.0 | 0.0 | 0.9584 |
| 0.2753 | 90.0 | 180 | 0.2615 | 0.3195 | 0.5000 | 0.9584 | nan | 0.0 | 1.0000 | 0.0 | 0.0 | 0.9584 |
| 0.2809 | 95.0 | 190 | 0.2547 | 0.4792 | 0.5 | 0.9584 | nan | 0.0 | 1.0 | nan | 0.0 | 0.9584 |
| 0.2606 | 100.0 | 200 | 0.2464 | 0.4792 | 0.5 | 0.9584 | nan | 0.0 | 1.0 | nan | 0.0 | 0.9584 |
| 0.2563 | 105.0 | 210 | 0.2459 | 0.4792 | 0.5 | 0.9584 | nan | 0.0 | 1.0 | nan | 0.0 | 0.9584 |
| 0.2454 | 110.0 | 220 | 0.2393 | 0.4792 | 0.5 | 0.9584 | nan | 0.0 | 1.0 | nan | 0.0 | 0.9584 |
| 0.2707 | 115.0 | 230 | 0.2368 | 0.4792 | 0.5 | 0.9584 | nan | 0.0 | 1.0 | nan | 0.0 | 0.9584 |
| 0.2433 | 120.0 | 240 | 0.2351 | 0.4792 | 0.5 | 0.9584 | nan | 0.0 | 1.0 | nan | 0.0 | 0.9584 |

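A note on reading these numbers: from epoch 75 onward, Accuracy Dropoff is 0.0 and Accuracy Undropoff is 1.0, i.e. the model predicts "undropoff" for every pixel, and the headline metrics are simply per-class means. The jump in Mean Iou from 0.3195 to 0.4792 at epoch 95 is not an improvement but the unlabeled class switching from an IoU of 0.0 to nan, so a three-way mean (0.9584 / 3 ≈ 0.3195) becomes a two-way mean. A quick check of the arithmetic:

```python
# The final metrics reduce to two-class means once the nan-valued
# "unlabeled" class is excluded; the model labels every pixel "undropoff".
iou_dropoff, iou_undropoff = 0.0, 0.9584
acc_dropoff, acc_undropoff = 0.0, 1.0

mean_iou = (iou_dropoff + iou_undropoff) / 2       # 0.4792, as reported
mean_accuracy = (acc_dropoff + acc_undropoff) / 2  # 0.5, as reported
print(mean_iou, mean_accuracy)
```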
### Framework versions

- Transformers 4.30.2
- PyTorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3