TeeA committed on
Commit
ddd5ffd
1 Parent(s): 194661f

Model save

README.md ADDED
@@ -0,0 +1,162 @@
---
license: apache-2.0
base_model: TeeA/resnet-50-finetuned-pokemon
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: resnet-50-finetuned-pokemon-finetuned-pokemon
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# resnet-50-finetuned-pokemon-finetuned-pokemon

This model is a fine-tuned version of [TeeA/resnet-50-finetuned-pokemon](https://huggingface.co/TeeA/resnet-50-finetuned-pokemon) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 14.5363
- Accuracy: 0.0842

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100

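As a sanity check on how these settings fit together (a sketch, not the actual training script), the effective batch size and the shape of the `linear` learning-rate schedule can be reproduced in plain Python. The total step count of 3800 is read off the last row of the results table; the schedule follows the usual linear-warmup-then-linear-decay shape implied by `lr_scheduler_type: linear` and `lr_scheduler_warmup_ratio: 0.1`.

```python
def linear_warmup_decay_lr(step, base_lr=5e-5, total_steps=3800, warmup_ratio=0.1):
    """Linear warmup from 0 to base_lr, then linear decay back to 0.

    Values mirror the hyperparameter list above; total_steps comes from
    the final row of the training-results table (step 3800).
    """
    warmup_steps = int(total_steps * warmup_ratio)  # 380 warmup steps here
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# Effective batch size seen by the optimizer: per-device batch size
# multiplied by the gradient accumulation steps.
train_batch_size = 32
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 128, as listed

print(total_train_batch_size)        # 128
print(linear_warmup_decay_lr(380))   # peak LR at the end of warmup: 5e-05
print(linear_warmup_decay_lr(3800))  # decayed to 0.0 at the final step
```

This also makes the warmup length concrete: 10% of 3800 optimization steps is 380 steps, roughly the first ten epochs at ~38 steps per epoch.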
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.1894        | 0.99  | 38   | 9.2115          | 0.0137   |
| 1.1389        | 1.99  | 76   | 9.2521          | 0.0129   |
| 1.0432        | 2.98  | 114  | 9.4765          | 0.0144   |
| 1.0625        | 4.0   | 153  | 9.7668          | 0.0137   |
| 1.0805        | 4.99  | 191  | 10.2526         | 0.0137   |
| 1.0353        | 5.99  | 229  | 10.3238         | 0.0129   |
| 0.9747        | 6.98  | 267  | 10.5779         | 0.0165   |
| 0.9708        | 8.0   | 306  | 10.7458         | 0.0180   |
| 0.8886        | 8.99  | 344  | 11.0072         | 0.0194   |
| 0.8408        | 9.99  | 382  | 11.3171         | 0.0223   |
| 0.802         | 10.98 | 420  | 11.5545         | 0.0245   |
| 0.7903        | 12.0  | 459  | 11.7722         | 0.0288   |
| 0.7553        | 12.99 | 497  | 11.9834         | 0.0353   |
| 0.7413        | 13.99 | 535  | 11.9815         | 0.0446   |
| 0.6272        | 14.98 | 573  | 12.0871         | 0.0496   |
| 0.6944        | 16.0  | 612  | 12.3713         | 0.0590   |
| 0.6322        | 16.99 | 650  | 12.6826         | 0.0554   |
| 0.6131        | 17.99 | 688  | 12.4819         | 0.0612   |
| 0.5916        | 18.98 | 726  | 12.6246         | 0.0647   |
| 0.5094        | 20.0  | 765  | 12.6641         | 0.0669   |
| 0.5201        | 20.99 | 803  | 12.8861         | 0.0662   |
| 0.4731        | 21.99 | 841  | 12.7431         | 0.0655   |
| 0.5132        | 22.98 | 879  | 12.7786         | 0.0705   |
| 0.5036        | 24.0  | 918  | 12.9990         | 0.0727   |
| 0.4863        | 24.99 | 956  | 13.0419         | 0.0727   |
| 0.4852        | 25.99 | 994  | 13.0573         | 0.0734   |
| 0.4983        | 26.98 | 1032 | 13.1310         | 0.0719   |
| 0.459         | 28.0  | 1071 | 13.0688         | 0.0748   |
| 0.4556        | 28.99 | 1109 | 13.4128         | 0.0748   |
| 0.4729        | 29.99 | 1147 | 13.3530         | 0.0741   |
| 0.4659        | 30.98 | 1185 | 13.2308         | 0.0763   |
| 0.4337        | 32.0  | 1224 | 13.3264         | 0.0748   |
| 0.456         | 32.99 | 1262 | 13.3506         | 0.0741   |
| 0.4423        | 33.99 | 1300 | 13.3607         | 0.0784   |
| 0.4037        | 34.98 | 1338 | 13.2521         | 0.0734   |
| 0.3891        | 36.0  | 1377 | 13.3702         | 0.0777   |
| 0.3992        | 36.99 | 1415 | 13.4762         | 0.0777   |
| 0.4014        | 37.99 | 1453 | 13.5382         | 0.0791   |
| 0.3549        | 38.98 | 1491 | 13.5550         | 0.0791   |
| 0.4048        | 40.0  | 1530 | 13.6406         | 0.0799   |
| 0.3711        | 40.99 | 1568 | 13.5120         | 0.0777   |
| 0.3834        | 41.99 | 1606 | 13.9230         | 0.0799   |
| 0.3475        | 42.98 | 1644 | 13.8602         | 0.0791   |
| 0.3465        | 44.0  | 1683 | 13.6931         | 0.0806   |
| 0.3682        | 44.99 | 1721 | 13.7774         | 0.0784   |
| 0.3613        | 45.99 | 1759 | 14.0235         | 0.0791   |
| 0.368         | 46.98 | 1797 | 13.9289         | 0.0813   |
| 0.3961        | 48.0  | 1836 | 14.2549         | 0.0806   |
| 0.365         | 48.99 | 1874 | 14.1114         | 0.0813   |
| 0.3259        | 49.99 | 1912 | 13.9710         | 0.0806   |
| 0.2998        | 50.98 | 1950 | 14.0288         | 0.0806   |
| 0.3203        | 52.0  | 1989 | 13.9398         | 0.0813   |
| 0.3104        | 52.99 | 2027 | 14.0255         | 0.0820   |
| 0.3232        | 53.99 | 2065 | 13.9355         | 0.0827   |
| 0.3521        | 54.98 | 2103 | 13.8627         | 0.0806   |
| 0.3322        | 56.0  | 2142 | 14.0179         | 0.0806   |
| 0.3129        | 56.99 | 2180 | 13.9640         | 0.0820   |
| 0.3159        | 57.99 | 2218 | 14.1997         | 0.0799   |
| 0.3118        | 58.98 | 2256 | 14.1639         | 0.0820   |
| 0.3196        | 60.0  | 2295 | 14.0334         | 0.0806   |
| 0.301         | 60.99 | 2333 | 13.9954         | 0.0820   |
| 0.3142        | 61.99 | 2371 | 14.1432         | 0.0799   |
| 0.3192        | 62.98 | 2409 | 14.0269         | 0.0784   |
| 0.3342        | 64.0  | 2448 | 14.0450         | 0.0806   |
| 0.3045        | 64.99 | 2486 | 14.1746         | 0.0849   |
| 0.2991        | 65.99 | 2524 | 14.3192         | 0.0806   |
| 0.3228        | 66.98 | 2562 | 14.1782         | 0.0784   |
| 0.2711        | 68.0  | 2601 | 14.4261         | 0.0849   |
| 0.2473        | 68.99 | 2639 | 14.2303         | 0.0827   |
| 0.3287        | 69.99 | 2677 | 14.2750         | 0.0827   |
| 0.2673        | 70.98 | 2715 | 14.2303         | 0.0820   |
| 0.2843        | 72.0  | 2754 | 14.4086         | 0.0806   |
| 0.3099        | 72.99 | 2792 | 14.5184         | 0.0827   |
| 0.3102        | 73.99 | 2830 | 14.2768         | 0.0835   |
| 0.2911        | 74.98 | 2868 | 14.1010         | 0.0835   |
| 0.2927        | 76.0  | 2907 | 14.4618         | 0.0813   |
| 0.2967        | 76.99 | 2945 | 14.3581         | 0.0820   |
| 0.2446        | 77.99 | 2983 | 14.4562         | 0.0835   |
| 0.3035        | 78.98 | 3021 | 14.2681         | 0.0835   |
| 0.2989        | 80.0  | 3060 | 14.2768         | 0.0827   |
| 0.2486        | 80.99 | 3098 | 14.4242         | 0.0820   |
| 0.2622        | 81.99 | 3136 | 14.3810         | 0.0835   |
| 0.2892        | 82.98 | 3174 | 14.4637         | 0.0827   |
| 0.2668        | 84.0  | 3213 | 14.4597         | 0.0835   |
| 0.2527        | 84.99 | 3251 | 14.3098         | 0.0820   |
| 0.2636        | 85.99 | 3289 | 14.3741         | 0.0835   |
| 0.247         | 86.98 | 3327 | 14.5369         | 0.0842   |
| 0.2693        | 88.0  | 3366 | 14.4039         | 0.0835   |
| 0.2692        | 88.99 | 3404 | 14.6161         | 0.0835   |
| 0.28          | 89.99 | 3442 | 14.5244         | 0.0835   |
| 0.2535        | 90.98 | 3480 | 14.4062         | 0.0842   |
| 0.2887        | 92.0  | 3519 | 14.4113         | 0.0806   |
| 0.257         | 92.99 | 3557 | 14.3442         | 0.0842   |
| 0.2627        | 93.99 | 3595 | 14.4693         | 0.0835   |
| 0.2804        | 94.98 | 3633 | 14.3223         | 0.0835   |
| 0.2529        | 96.0  | 3672 | 14.3844         | 0.0835   |
| 0.2327        | 96.99 | 3710 | 14.4284         | 0.0835   |
| 0.2643        | 97.99 | 3748 | 14.5567         | 0.0835   |
| 0.284         | 98.98 | 3786 | 14.6738         | 0.0813   |
| 0.2503        | 99.35 | 3800 | 14.5363         | 0.0842   |


### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:f8f75bf1a737766e7eca9790c044cef7a2e0c275c1b1346e4d7811838584d20e
+ oid sha256:2d7e723ede5bd182b8e6c4c18bd5db54730382cf0b4bd1ea18beff9b52b2e76d
  size 95516112
runs/Jan01_05-42-08_495197ec28a5/events.out.tfevents.1704087738.495197ec28a5.415.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:e00c65b5b21bd372580f0d3c0213d589a678959ea39a5af89cb17f213c313187
- size 101989
+ oid sha256:083f23dc1a22fef7cb6cdbbf5c0c113964880d653c3a964fa046d4ddce4d91e3
+ size 102980