gabryland committed
Commit 57b1aad
1 parent: 63d5e40

End of training

Files changed (2)
  1. README.md +70 -0
  2. pytorch_model.bin +1 -1
README.md ADDED
@@ -0,0 +1,70 @@
+ ---
+ license: other
+ base_model: google/deeplabv3_mobilenet_v2_1.0_513
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: lpcv_seg
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # lpcv_seg
+
+ This model is a fine-tuned version of [google/deeplabv3_mobilenet_v2_1.0_513](https://huggingface.co/google/deeplabv3_mobilenet_v2_1.0_513) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.7747
+ - Mean Iou: 0.3647
+ - Mean Accuracy: 0.4742
+ - Overall Accuracy: 0.7441
+ - Per Category Iou: [0.7508396000072782, 0.461963906773346, 0.41562431632163865, 0.2643890336606752, 0.20882280410355394, 0.21001420486640948, 0.45776923048905305, 0.6263265430221951, 0.5990132199881534, nan, 0.0, 0.4389678627683, 0.30687238462719907, 0.0]
+ - Per Category Accuracy: [0.8794950417196551, 0.5122587212045141, 0.47963636761360323, 0.28307093261894156, 0.22726847453969443, 0.817351469679542, 0.5476940642254209, 0.672940072704085, 0.8955934160757268, nan, 0.0, 0.44608425759467085, 0.40330970104326763, 0.0]
+
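As a usage reference, a minimal inference sketch follows. It assumes the checkpoint is published on the Hub as `gabryland/lpcv_seg` and that the generic `transformers` semantic-segmentation auto classes resolve to the DeepLabV3/MobileNetV2 implementation of the base model; neither detail is stated in the card, so treat this as a sketch rather than documented usage.

```python
# Hypothetical inference sketch; the repo id "gabryland/lpcv_seg" and the image
# path below are assumptions, not taken from the model card.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForSemanticSegmentation

repo_id = "gabryland/lpcv_seg"  # assumed Hub id for this checkpoint
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # any RGB test image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, h, w) at reduced resolution

# Upsample to the original image size and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) map of predicted class ids
```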
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 6e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 30
+
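These values correspond to a standard `transformers` `Trainer` run. A rough reconstruction as `TrainingArguments` is sketched below; the output directory and the use of `Trainer` itself are assumptions, since the training script is not part of this commit, and `train_batch_size` is mapped to the per-device batch size.

```python
# Hypothetical mapping of the listed hyperparameters onto TrainingArguments.
# Only the values above come from the card; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="lpcv_seg",          # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=30,
    lr_scheduler_type="linear",     # linear decay of the learning rate
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```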
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
+ |:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
+ | 1.0299 | 3.12 | 400 | 1.1445 | 0.1503 | 0.2293 | 0.6846 | [0.7016834423401825, 0.0, 0.28392253175773663, 0.09630649211770352, 0.16489618177561766, 0.05923195228949304, 0.0, 0.47146079872020313, 0.0, nan, 0.0, 0.002537363927640351, 0.17407988046100897, 0.0] | [0.891422563505799, 0.0, 0.7160794493379222, 0.101650104427665, 0.20155241518296468, 0.08566222638071554, 0.0, 0.7801995050268263, 0.0, nan, 0.0, 0.002538359085235112, 0.20133152963007558, 0.0] |
+ | 1.2946 | 6.25 | 800 | 0.9463 | 0.1514 | 0.2217 | 0.6990 | [0.7200410800543089, 0.025577984693804623, 0.272186783532416, 0.03900872477465889, 0.1420562982408946, 0.13672818472813886, 0.0, 0.3747836709762802, 1.4597782434650593e-05, nan, 0.0, 0.0, 0.2576138826730331, 0.0] | [0.9197800777719178, 0.025581056237151574, 0.3936555737035525, 0.03917595636589459, 0.1525480793198657, 0.16772954472705784, 0.0, 0.8514708072975876, 1.4597782434650593e-05, nan, 0.0, 0.0, 0.33275562889985905, 0.0] |
+ | 0.8869 | 9.38 | 1200 | 0.9005 | 0.2553 | 0.3463 | 0.7230 | [0.7311911080696011, 0.1716353741804461, 0.32989770226558307, 0.2484733739307282, 0.20678309581325105, 0.16925674004774066, 0.026560396219635213, 0.43309262889720945, 0.4791539954039438, nan, 0.0, 0.1996331281686418, 0.3234492318785704, 0.0] | [0.8817514590505282, 0.1734977169471996, 0.5536498133348888, 0.26666168042981364, 0.2199222972903758, 0.30495278898499545, 0.026735171458080034, 0.8794234634009936, 0.5647817144933271, nan, 0.0, 0.19998287851911137, 0.43016977002783463, 0.0] |
+ | 0.8056 | 12.5 | 1600 | 0.7305 | 0.3106 | 0.3897 | 0.7595 | [0.7538928589848243, 0.25918801874572633, 0.4203485338620474, 0.2917137053557758, 0.22289503658523782, 0.1500749815626878, 0.08861529960760275, 0.624086456268012, 0.48451939021185103, nan, 0.0, 0.3968018568653773, 0.34541558718052773, 0.0] | [0.9269385933729979, 0.2666068145623601, 0.5690834086799277, 0.323866282726429, 0.23199974868520706, 0.27844356662715314, 0.08945714746397179, 0.7853752936779852, 0.7445501612243971, nan, 0.0, 0.4213536720599331, 0.42861443009823574, 0.0] |
+ | 0.7534 | 15.62 | 2000 | 0.8091 | 0.2762 | 0.3774 | 0.7408 | [0.7450525081446471, 0.08805507548598117, 0.39296675038083556, 0.17888143033371248, 0.23466648108570068, 0.16079006682931019, 0.03714210117676656, 0.6219865494576018, 0.5133743295405404, nan, 0.0, 0.25318448016278294, 0.3644475822818699, 0.0] | [0.8925789949539187, 0.08813412970942185, 0.6256571559820335, 0.18979883934004887, 0.24639981315291482, 0.4060874245051725, 0.03731593695118816, 0.8003825980603533, 0.8838065177476595, nan, 0.0, 0.25812424222515545, 0.47821461707899227, 0.0] |
+ | 0.909 | 18.75 | 2400 | 0.7351 | 0.3258 | 0.4109 | 0.7698 | [0.7635658546626002, 0.40487164496967293, 0.3833268397798514, 0.2752089531995694, 0.2867291055853458, 0.019354003421208514, 0.16890635042927862, 0.5748136637050552, 0.6549805047839075, nan, 0.0, 0.41018196642738697, 0.29410209406700305, 0.0] | [0.936836241285118, 0.4435412622122596, 0.6115797993350055, 0.29433416172859234, 0.3084226650431367, 0.01980599690692518, 0.1729223045848531, 0.8637196946532967, 0.9040471540812156, nan, 0.0, 0.44469064868513, 0.3419148529297193, 0.0] |
+ | 0.6622 | 21.88 | 2800 | 0.6963 | 0.3398 | 0.3986 | 0.7827 | [0.7663761109958838, 0.1332027535954636, 0.4428707076945945, 0.35775923195424675, 0.3733786220955952, 0.24949901778334058, 0.07535404639899677, 0.6174965473057359, 0.5443824880761584, nan, 0.0, 0.4962856002887922, 0.3608184550398603, 0.0] | [0.9493652245312661, 0.13338632341294235, 0.6300075103540804, 0.4173944872618647, 0.41213577281689057, 0.31108969330720987, 0.0759392172499597, 0.6593100168039995, 0.5955570838172228, nan, 0.0, 0.5200351189445205, 0.4777004199984418, 0.0] |
+ | 1.1202 | 25.0 | 3200 | 1.6352 | 0.1684 | 0.2322 | 0.7026 | [0.6963172586859363, 0.011559652438449069, 0.22669839158942193, 0.05922256175168542, 0.06532726727054458, 0.0023358067869460536, 0.0, 0.509361415684211, 0.38727970470408707, nan, 0.0, 0.09125458640751972, 0.13993367687629527, 0.0] | [0.9487412199772098, 0.011559652438449069, 0.24983867321938985, 0.05932148648786475, 0.06540672362680026, 0.003050864548414594, 0.0, 0.7020197466594538, 0.7380590139684559, nan, 0.0, 0.09130527401337467, 0.14901587211649467, 0.0] |
+ | 0.6529 | 28.12 | 3600 | 0.7747 | 0.3647 | 0.4742 | 0.7441 | [0.7508396000072782, 0.461963906773346, 0.41562431632163865, 0.2643890336606752, 0.20882280410355394, 0.21001420486640948, 0.45776923048905305, 0.6263265430221951, 0.5990132199881534, nan, 0.0, 0.4389678627683, 0.30687238462719907, 0.0] | [0.8794950417196551, 0.5122587212045141, 0.47963636761360323, 0.28307093261894156, 0.22726847453969443, 0.817351469679542, 0.5476940642254209, 0.672940072704085, 0.8955934160757268, nan, 0.0, 0.44608425759467085, 0.40330970104326763, 0.0] |
+
+
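The reported Mean Iou and Mean Accuracy are consistent with a nan-aware average over the per-category values (one category is reported as `nan`, presumably because it does not occur in the evaluation set). For the final evaluation row, for example:

```python
# Checking how Per Category Iou relates to Mean Iou for the last evaluation
# (epoch 28.12): nan entries are excluded from the average.
import numpy as np

per_category_iou = [
    0.7508396000072782, 0.461963906773346, 0.41562431632163865,
    0.2643890336606752, 0.20882280410355394, 0.21001420486640948,
    0.45776923048905305, 0.6263265430221951, 0.5990132199881534,
    float("nan"), 0.0, 0.4389678627683, 0.30687238462719907, 0.0,
]

print(round(float(np.nanmean(per_category_iou)), 4))  # 0.3647, the reported Mean Iou
```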
+ ### Framework versions
+
+ - Transformers 4.32.1
+ - Pytorch 2.0.1+cu118
+ - Datasets 2.14.4
+ - Tokenizers 0.13.3
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:aefb7cd12a846666f1c912ffbd659bef3ba97936aeb7314f8a6bfe14f597e7f8
+ oid sha256:c5394e3db7ac90e8a93e668844772cb54ccd89968bd56f0192bf79d60c1530a9
  size 10353021