amanvip2 committed on
Commit b88f8e1
1 Parent(s): 44d926c

update model card README.md

Files changed (1)
  1. README.md +45 -21
README.md CHANGED
@@ -16,10 +16,8 @@ should probably proofread and complete it, then remove this comment. -->

  This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Train Accuracy: 0.8288
- - Train Loss: 0.0140
- - Accuracy: 0.8797
- - Loss: 0.0111
+ - Loss: 0.2529
+ - Accuracy: 0.9525

  ## Model description

@@ -44,28 +42,54 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 2
+ - num_epochs: 50

  ### Training results

  | Training Loss | Epoch | Step | Accuracy | Loss | Validation Loss |
  |:-------------:|:-----:|:----:|:--------:|:------:|:---------------:|
- | No log | 0.25 | 10 | 0.6456 | 0.0227 | 0.0208 |
- | No log | 0.25 | 10 | 0.6456 | 0.0227 | 0.0208 |
- | No log | 0.5 | 20 | 0.7563 | 0.0203 | 0.0193 |
- | No log | 0.5 | 20 | 0.7563 | 0.0203 | 0.0193 |
- | No log | 0.75 | 30 | 0.6994 | 0.0191 | 0.0168 |
- | No log | 0.75 | 30 | 0.6994 | 0.0191 | 0.0168 |
- | No log | 1.0 | 40 | 0.7595 | 0.0185 | 0.0166 |
- | No log | 1.0 | 40 | 0.7595 | 0.0185 | 0.0166 |
- | No log | 1.25 | 50 | 0.8797 | 0.0156 | 0.0121 |
- | No log | 1.25 | 50 | 0.8797 | 0.0156 | 0.0121 |
- | No log | 1.5 | 60 | 0.8639 | 0.0152 | 0.0114 |
- | No log | 1.5 | 60 | 0.8639 | 0.0152 | 0.0114 |
- | No log | 1.75 | 70 | 0.8671 | 0.0150 | 0.0115 |
- | No log | 1.75 | 70 | 0.8671 | 0.0150 | 0.0115 |
- | No log | 2.0 | 80 | 0.8797 | 0.0140 | 0.0111 |
- | No log | 2.0 | 80 | 0.8797 | 0.0140 | 0.0111 |
+ | No log | 2.5 | 100 | 0.8924 | 0.0151 | 0.0105 |
+ | No log | 2.5 | 100 | 0.8924 | 0.0151 | 0.0105 |
+ | No log | 5.0 | 200 | 0.8481 | 0.0156 | 0.0130 |
+ | No log | 5.0 | 200 | 0.8481 | 0.0156 | 0.0130 |
+ | No log | 7.5 | 300 | 0.8892 | 0.0137 | 0.0103 |
+ | No log | 7.5 | 300 | 0.8892 | 0.0137 | 0.0103 |
+ | No log | 10.0 | 400 | 0.9051 | 0.0130 | 0.0089 |
+ | No log | 10.0 | 400 | 0.9051 | 0.0130 | 0.0089 |
+ | No log | 12.5 | 500 | 0.9051 | 0.0117 | 0.0088 |
+ | No log | 12.5 | 500 | 0.9051 | 0.0117 | 0.0088 |
+ | No log | 15.0 | 600 | 0.8956 | 0.0127 | 0.0117 |
+ | No log | 15.0 | 600 | 0.8956 | 0.0127 | 0.0117 |
+ | No log | 17.5 | 700 | 0.9241 | 0.0100 | 0.0082 |
+ | No log | 17.5 | 700 | 0.9241 | 0.0100 | 0.0082 |
+ | No log | 20.0 | 800 | 0.9367 | 0.0103 | 0.0073 |
+ | No log | 20.0 | 800 | 0.9367 | 0.0103 | 0.0073 |
+ | No log | 22.5 | 900 | 0.9209 | 0.0106 | 0.0079 |
+ | No log | 22.5 | 900 | 0.9209 | 0.0106 | 0.0079 |
+ | No log | 25.0 | 1000 | 0.9367 | 0.0095 | 0.0067 |
+ | No log | 25.0 | 1000 | 0.9367 | 0.0095 | 0.0067 |
+ | No log | 25.0 | 1000 | 0.2138 | 0.9367 |
+ | No log | 27.5 | 1100 | 0.9494 | 0.0081 | 0.0069 |
+ | No log | 27.5 | 1100 | 0.9494 | 0.0081 | 0.0069 |
+ | No log | 30.0 | 1200 | 0.9494 | 0.0089 | 0.0077 |
+ | No log | 30.0 | 1200 | 0.9494 | 0.0089 | 0.0077 |
+ | No log | 32.5 | 1300 | 0.9430 | 0.0078 | 0.0074 |
+ | No log | 32.5 | 1300 | 0.9430 | 0.0078 | 0.0074 |
+ | No log | 35.0 | 1400 | 0.9462 | 0.0087 | 0.0086 |
+ | No log | 35.0 | 1400 | 0.9462 | 0.0087 | 0.0086 |
+ | No log | 37.5 | 1500 | 0.9430 | 0.0080 | 0.0084 |
+ | No log | 37.5 | 1500 | 0.9430 | 0.0080 | 0.0084 |
+ | No log | 40.0 | 1600 | 0.9494 | 0.0082 | 0.0070 |
+ | No log | 40.0 | 1600 | 0.9494 | 0.0082 | 0.0070 |
+ | No log | 42.5 | 1700 | 0.9399 | 0.0076 | 0.0091 |
+ | No log | 42.5 | 1700 | 0.9399 | 0.0076 | 0.0091 |
+ | No log | 45.0 | 1800 | 0.9462 | 0.0077 | 0.0095 |
+ | No log | 45.0 | 1800 | 0.9462 | 0.0077 | 0.0095 |
+ | No log | 47.5 | 1900 | 0.9525 | 0.0065 | 0.0080 |
+ | No log | 47.5 | 1900 | 0.9525 | 0.0065 | 0.0080 |
+ | No log | 50.0 | 2000 | 0.9525 | 0.0060 | 0.0079 |
+ | No log | 50.0 | 2000 | 0.9525 | 0.0060 | 0.0079 |
+ | No log | 50.0 | 2000 | 0.2529 | 0.9525 |


  ### Framework versions
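
For reference, the hyperparameters visible in the diff above (seed 42, Adam with betas=(0.9,0.999) and epsilon=1e-08, a linear learning-rate schedule, and num_epochs raised from 2 to 50) map onto a fairly standard `transformers` Trainer configuration. The sketch below is an illustration only, not the author's actual script: the output directory, number of labels, learning rate, batch size, and dataset are placeholders that are not recorded in this commit.

```python
# Minimal sketch of a Trainer setup matching the hyperparameters in the updated card.
# Values marked "placeholder" are assumptions, not taken from this diff.
from transformers import Trainer, TrainingArguments, ViTForImageClassification, ViTImageProcessor

model_name = "google/vit-base-patch16-224-in21k"
processor = ViTImageProcessor.from_pretrained(model_name)
model = ViTForImageClassification.from_pretrained(model_name, num_labels=2)  # placeholder label count

training_args = TrainingArguments(
    output_dir="vit-finetuned",      # placeholder output directory
    seed=42,                         # "seed: 42"
    adam_beta1=0.9,                  # "optimizer: Adam with betas=(0.9,0.999)"
    adam_beta2=0.999,
    adam_epsilon=1e-8,               # "epsilon=1e-08"
    lr_scheduler_type="linear",      # "lr_scheduler_type: linear"
    num_train_epochs=50,             # "num_epochs: 50" in the updated card
    evaluation_strategy="steps",     # the results table reports eval metrics every 100 steps
    eval_steps=100,
    logging_steps=100,
)

# A Trainer would then be built with the image dataset (not part of this commit):
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```

Whether the run was launched exactly this way cannot be determined from the card alone; the framework versions section pins only the library versions involved.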