ashishp-wiai committed
Commit 959537a
Parent(s): 4ae1ae6
Model save
Files changed:
- README.md +258 -0
- pytorch_model.bin +1 -1
README.md
ADDED
@@ -0,0 +1,258 @@
---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vit-base-patch16-224-in21k-finetune-os300_norm
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-in21k-finetune-os300_norm

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3499
- Accuracy: 0.8577

## Model description

More information needed

## Intended uses & limitations

More information needed
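
Usage is not yet documented, but since this is a ViT image classifier trained with the 🤗 Transformers `Trainer`, the standard image-classification inference path should apply. A minimal sketch follows; the repo id (the committer's namespace plus the model name above) and the sample image path are assumptions, not confirmed by this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumed repo id: committer namespace + the model name in this card.
model_id = "ashishp-wiai/vit-base-patch16-224-in21k-finetune-os300_norm"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
# id2label maps class indices to names when the config provides them.
print(model.config.id2label.get(pred, str(pred)))
```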

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch mirroring them follows the list):
- learning_rate: 0.005
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 200
- mixed_precision_training: Native AMP
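
As a reproducibility aid, here is a minimal `TrainingArguments` sketch matching the list above. The output directory is a placeholder, `fp16=True` stands in for "Native AMP", and a single device is assumed, so 128 per-device samples x 4 accumulation steps gives the total train batch size of 512.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments mirroring the hyperparameter list above.
# output_dir is a placeholder; fp16=True corresponds to "Native AMP".
training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-in21k-finetune-os300_norm",  # placeholder
    learning_rate=5e-3,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=4,  # 128 x 4 = 512 effective (single device)
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    fp16=True,
)
```
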
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 1.038 | 0.98 | 11 | 0.7215 | 0.6568 |
| 0.7212 | 1.96 | 22 | 0.7280 | 0.6568 |
| 0.7201 | 2.93 | 33 | 0.7285 | 0.6568 |
| 0.7308 | 4.0 | 45 | 0.7297 | 0.6568 |
| 0.7341 | 4.98 | 56 | 0.7277 | 0.6568 |
| 0.7255 | 5.96 | 67 | 0.7350 | 0.6568 |
| 0.7274 | 6.93 | 78 | 0.7258 | 0.6568 |
| 0.7189 | 8.0 | 90 | 0.7205 | 0.6568 |
| 0.7194 | 8.98 | 101 | 0.7117 | 0.6568 |
| 0.7437 | 9.96 | 112 | 0.7340 | 0.6568 |
| 0.7578 | 10.93 | 123 | 0.7317 | 0.6568 |
| 0.7307 | 12.0 | 135 | 0.7288 | 0.6568 |
| 0.7279 | 12.98 | 146 | 0.7246 | 0.6568 |
| 0.727 | 13.96 | 157 | 0.7166 | 0.6568 |
| 0.7161 | 14.93 | 168 | 0.7306 | 0.5117 |
| 0.6775 | 16.0 | 180 | 0.6360 | 0.6568 |
| 0.6487 | 16.98 | 191 | 0.6166 | 0.7113 |
| 0.607 | 17.96 | 202 | 0.5871 | 0.7240 |
| 0.5961 | 18.93 | 213 | 0.5606 | 0.7183 |
| 0.5681 | 20.0 | 225 | 0.5459 | 0.7381 |
| 0.5756 | 20.98 | 236 | 0.5375 | 0.7481 |
| 0.5666 | 21.96 | 247 | 0.5720 | 0.7042 |
| 0.5658 | 22.93 | 258 | 0.5127 | 0.7481 |
| 0.5461 | 24.0 | 270 | 0.5254 | 0.7360 |
| 0.5484 | 24.98 | 281 | 0.5124 | 0.7431 |
| 0.5442 | 25.96 | 292 | 0.5665 | 0.7282 |
| 0.5573 | 26.93 | 303 | 0.5019 | 0.7594 |
| 0.535 | 28.0 | 315 | 0.5112 | 0.7792 |
| 0.5319 | 28.98 | 326 | 0.4729 | 0.7856 |
| 0.4953 | 29.96 | 337 | 0.6292 | 0.7318 |
| 0.5408 | 30.93 | 348 | 0.5083 | 0.7877 |
| 0.5215 | 32.0 | 360 | 0.5131 | 0.7799 |
| 0.5291 | 32.98 | 371 | 0.4867 | 0.7983 |
| 0.4971 | 33.96 | 382 | 0.4742 | 0.7962 |
| 0.5004 | 34.93 | 393 | 0.4930 | 0.7806 |
| 0.4868 | 36.0 | 405 | 0.4550 | 0.8061 |
| 0.4784 | 36.98 | 416 | 0.4667 | 0.7912 |
| 0.469 | 37.96 | 427 | 0.4915 | 0.7856 |
| 0.455 | 38.93 | 438 | 0.5016 | 0.7537 |
| 0.4903 | 40.0 | 450 | 0.4874 | 0.7877 |
| 0.4904 | 40.98 | 461 | 0.5222 | 0.7629 |
| 0.513 | 41.96 | 472 | 0.4772 | 0.7877 |
| 0.4913 | 42.93 | 483 | 0.5386 | 0.7629 |
| 0.5216 | 44.0 | 495 | 0.4830 | 0.7827 |
| 0.4931 | 44.98 | 506 | 0.4692 | 0.7948 |
| 0.4835 | 45.96 | 517 | 0.4941 | 0.7757 |
| 0.5035 | 46.93 | 528 | 0.4716 | 0.7884 |
| 0.5068 | 48.0 | 540 | 0.5210 | 0.7841 |
| 0.5207 | 48.98 | 551 | 0.4656 | 0.8132 |
| 0.4753 | 49.96 | 562 | 0.4529 | 0.8025 |
| 0.4718 | 50.93 | 573 | 0.4403 | 0.8075 |
| 0.4757 | 52.0 | 585 | 0.4305 | 0.8132 |
| 0.4352 | 52.98 | 596 | 0.4104 | 0.8245 |
| 0.4349 | 53.96 | 607 | 0.4390 | 0.8125 |
| 0.4508 | 54.93 | 618 | 0.4409 | 0.8011 |
| 0.4596 | 56.0 | 630 | 0.4131 | 0.8323 |
| 0.4321 | 56.98 | 641 | 0.4257 | 0.8188 |
| 0.4433 | 57.96 | 652 | 0.4421 | 0.7969 |
| 0.4423 | 58.93 | 663 | 0.4430 | 0.7990 |
| 0.446 | 60.0 | 675 | 0.4328 | 0.8181 |
| 0.425 | 60.98 | 686 | 0.4385 | 0.8011 |
| 0.4363 | 61.96 | 697 | 0.4225 | 0.8139 |
| 0.4358 | 62.93 | 708 | 0.4114 | 0.8224 |
| 0.415 | 64.0 | 720 | 0.4110 | 0.8174 |
| 0.423 | 64.98 | 731 | 0.4090 | 0.8238 |
| 0.4161 | 65.96 | 742 | 0.4011 | 0.8160 |
| 0.4103 | 66.93 | 753 | 0.4207 | 0.8188 |
| 0.4254 | 68.0 | 765 | 0.4503 | 0.8004 |
| 0.429 | 68.98 | 776 | 0.4392 | 0.8033 |
| 0.4341 | 69.96 | 787 | 0.4159 | 0.8209 |
| 0.4574 | 70.93 | 798 | 0.4165 | 0.8224 |
| 0.4136 | 72.0 | 810 | 0.3954 | 0.8337 |
| 0.4226 | 72.98 | 821 | 0.3996 | 0.8301 |
| 0.4124 | 73.96 | 832 | 0.4205 | 0.8089 |
| 0.4209 | 74.93 | 843 | 0.4288 | 0.8146 |
| 0.4493 | 76.0 | 855 | 0.4193 | 0.8167 |
| 0.4302 | 76.98 | 866 | 0.4239 | 0.8132 |
| 0.4385 | 77.96 | 877 | 0.4187 | 0.8160 |
| 0.4388 | 78.93 | 888 | 0.4379 | 0.8047 |
| 0.4294 | 80.0 | 900 | 0.4048 | 0.8309 |
| 0.4207 | 80.98 | 911 | 0.4287 | 0.8139 |
| 0.4316 | 81.96 | 922 | 0.4183 | 0.8202 |
| 0.4283 | 82.93 | 933 | 0.4091 | 0.8224 |
| 0.4227 | 84.0 | 945 | 0.4070 | 0.8231 |
| 0.4335 | 84.98 | 956 | 0.4184 | 0.8224 |
| 0.4433 | 85.96 | 967 | 0.4148 | 0.8132 |
| 0.4287 | 86.93 | 978 | 0.4188 | 0.8167 |
| 0.4327 | 88.0 | 990 | 0.4091 | 0.8224 |
| 0.427 | 88.98 | 1001 | 0.4118 | 0.8202 |
| 0.4194 | 89.96 | 1012 | 0.4220 | 0.8153 |
| 0.4213 | 90.93 | 1023 | 0.4195 | 0.8096 |
| 0.4288 | 92.0 | 1035 | 0.4023 | 0.8188 |
| 0.4123 | 92.98 | 1046 | 0.4005 | 0.8393 |
| 0.4172 | 93.96 | 1057 | 0.3812 | 0.8309 |
| 0.4109 | 94.93 | 1068 | 0.3838 | 0.8294 |
| 0.4128 | 96.0 | 1080 | 0.3878 | 0.8294 |
| 0.3976 | 96.98 | 1091 | 0.4023 | 0.8259 |
| 0.4097 | 97.96 | 1102 | 0.3979 | 0.8153 |
| 0.4059 | 98.93 | 1113 | 0.3953 | 0.8294 |
| 0.4011 | 100.0 | 1125 | 0.3804 | 0.8344 |
| 0.4126 | 100.98 | 1136 | 0.3915 | 0.8259 |
| 0.425 | 101.96 | 1147 | 0.4140 | 0.8160 |
| 0.4066 | 102.93 | 1158 | 0.4207 | 0.8238 |
| 0.4265 | 104.0 | 1170 | 0.4016 | 0.8259 |
| 0.4225 | 104.98 | 1181 | 0.4059 | 0.8252 |
| 0.4201 | 105.96 | 1192 | 0.3980 | 0.8309 |
| 0.408 | 106.93 | 1203 | 0.4171 | 0.8202 |
| 0.422 | 108.0 | 1215 | 0.4475 | 0.8096 |
| 0.4251 | 108.98 | 1226 | 0.4139 | 0.8224 |
| 0.4261 | 109.96 | 1237 | 0.4113 | 0.8167 |
| 0.4147 | 110.93 | 1248 | 0.4355 | 0.8089 |
| 0.4407 | 112.0 | 1260 | 0.4453 | 0.8146 |
| 0.4167 | 112.98 | 1271 | 0.3987 | 0.8372 |
| 0.4152 | 113.96 | 1282 | 0.4008 | 0.8273 |
| 0.3952 | 114.93 | 1293 | 0.3843 | 0.8351 |
| 0.4159 | 116.0 | 1305 | 0.3949 | 0.8330 |
| 0.4014 | 116.98 | 1316 | 0.4113 | 0.8040 |
| 0.4203 | 117.96 | 1327 | 0.3988 | 0.8309 |
| 0.4159 | 118.93 | 1338 | 0.4037 | 0.8351 |
| 0.4065 | 120.0 | 1350 | 0.3847 | 0.8393 |
| 0.3938 | 120.98 | 1361 | 0.4023 | 0.8280 |
| 0.4202 | 121.96 | 1372 | 0.4015 | 0.8301 |
| 0.4316 | 122.93 | 1383 | 0.4156 | 0.8174 |
| 0.416 | 124.0 | 1395 | 0.3924 | 0.8344 |
| 0.4141 | 124.98 | 1406 | 0.3839 | 0.8358 |
| 0.4157 | 125.96 | 1417 | 0.3940 | 0.8224 |
| 0.3906 | 126.93 | 1428 | 0.3826 | 0.8287 |
| 0.4051 | 128.0 | 1440 | 0.3807 | 0.8316 |
| 0.3835 | 128.98 | 1451 | 0.3866 | 0.8386 |
| 0.3976 | 129.96 | 1462 | 0.3832 | 0.8457 |
| 0.3939 | 130.93 | 1473 | 0.3745 | 0.8351 |
| 0.3862 | 132.0 | 1485 | 0.3897 | 0.8408 |
| 0.3919 | 132.98 | 1496 | 0.3841 | 0.8429 |
| 0.3928 | 133.96 | 1507 | 0.3744 | 0.8507 |
| 0.3976 | 134.93 | 1518 | 0.3610 | 0.8535 |
| 0.3834 | 136.0 | 1530 | 0.3711 | 0.8422 |
| 0.3827 | 136.98 | 1541 | 0.3860 | 0.8422 |
| 0.4036 | 137.96 | 1552 | 0.3973 | 0.8301 |
| 0.3862 | 138.93 | 1563 | 0.3720 | 0.8429 |
| 0.3876 | 140.0 | 1575 | 0.3701 | 0.8478 |
| 0.3941 | 140.98 | 1586 | 0.3579 | 0.8500 |
| 0.3692 | 141.96 | 1597 | 0.3609 | 0.8521 |
| 0.3791 | 142.93 | 1608 | 0.3666 | 0.8493 |
| 0.3774 | 144.0 | 1620 | 0.3601 | 0.8521 |
| 0.3708 | 144.98 | 1631 | 0.3592 | 0.8549 |
| 0.3943 | 145.96 | 1642 | 0.3593 | 0.8493 |
| 0.3856 | 146.93 | 1653 | 0.3686 | 0.8429 |
| 0.381 | 148.0 | 1665 | 0.3755 | 0.8429 |
| 0.3965 | 148.98 | 1676 | 0.3698 | 0.8471 |
| 0.3862 | 149.96 | 1687 | 0.3641 | 0.8485 |
| 0.3825 | 150.93 | 1698 | 0.3652 | 0.8528 |
| 0.3751 | 152.0 | 1710 | 0.3672 | 0.8422 |
| 0.3812 | 152.98 | 1721 | 0.3626 | 0.8507 |
| 0.3805 | 153.96 | 1732 | 0.3615 | 0.8493 |
| 0.3755 | 154.93 | 1743 | 0.3678 | 0.8500 |
| 0.3802 | 156.0 | 1755 | 0.3682 | 0.8478 |
| 0.3781 | 156.98 | 1766 | 0.3802 | 0.8485 |
| 0.3845 | 157.96 | 1777 | 0.3753 | 0.8507 |
| 0.3893 | 158.93 | 1788 | 0.3694 | 0.8485 |
| 0.3676 | 160.0 | 1800 | 0.3652 | 0.8493 |
| 0.4114 | 160.98 | 1811 | 0.4020 | 0.8309 |
| 0.39 | 161.96 | 1822 | 0.3615 | 0.8528 |
| 0.3831 | 162.93 | 1833 | 0.3570 | 0.8535 |
| 0.3651 | 164.0 | 1845 | 0.3642 | 0.8401 |
| 0.3662 | 164.98 | 1856 | 0.3557 | 0.8577 |
| 0.3878 | 165.96 | 1867 | 0.3650 | 0.8457 |
| 0.376 | 166.93 | 1878 | 0.3601 | 0.8500 |
| 0.3724 | 168.0 | 1890 | 0.3617 | 0.8570 |
| 0.3661 | 168.98 | 1901 | 0.3677 | 0.8535 |
| 0.3869 | 169.96 | 1912 | 0.3617 | 0.8500 |
| 0.3717 | 170.93 | 1923 | 0.3594 | 0.8436 |
| 0.3698 | 172.0 | 1935 | 0.3632 | 0.8514 |
| 0.3761 | 172.98 | 1946 | 0.3614 | 0.8471 |
| 0.3847 | 173.96 | 1957 | 0.3566 | 0.8535 |
| 0.3716 | 174.93 | 1968 | 0.3570 | 0.8528 |
| 0.3695 | 176.0 | 1980 | 0.3557 | 0.8556 |
| 0.3702 | 176.98 | 1991 | 0.3544 | 0.8556 |
| 0.372 | 177.96 | 2002 | 0.3522 | 0.8542 |
| 0.3648 | 178.93 | 2013 | 0.3562 | 0.8493 |
| 0.3744 | 180.0 | 2025 | 0.3577 | 0.8507 |
| 0.3546 | 180.98 | 2036 | 0.3524 | 0.8535 |
| 0.3613 | 181.96 | 2047 | 0.3478 | 0.8528 |
| 0.3581 | 182.93 | 2058 | 0.3534 | 0.8549 |
| 0.3709 | 184.0 | 2070 | 0.3637 | 0.8521 |
| 0.3699 | 184.98 | 2081 | 0.3544 | 0.8549 |
| 0.3701 | 185.96 | 2092 | 0.3506 | 0.8613 |
| 0.3634 | 186.93 | 2103 | 0.3559 | 0.8592 |
| 0.3668 | 188.0 | 2115 | 0.3510 | 0.8585 |
| 0.3629 | 188.98 | 2126 | 0.3485 | 0.8592 |
| 0.3544 | 189.96 | 2137 | 0.3478 | 0.8627 |
| 0.3714 | 190.93 | 2148 | 0.3512 | 0.8592 |
| 0.3681 | 192.0 | 2160 | 0.3522 | 0.8592 |
| 0.3466 | 192.98 | 2171 | 0.3523 | 0.8570 |
| 0.3727 | 193.96 | 2182 | 0.3504 | 0.8606 |
| 0.3564 | 194.93 | 2193 | 0.3501 | 0.8577 |
| 0.3616 | 195.56 | 2200 | 0.3499 | 0.8577 |

### Framework versions

- Transformers 4.39.0
- PyTorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2

pytorch_model.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:235020151948405696e1986faeeea010f18ba6bc22f15703fa804c698965d055
 size 343272234
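
The only change above fills in the SHA-256 object id of the Git LFS pointer for the new weights; per the LFS spec, that oid is the SHA-256 of the file's contents. As a sanity check after downloading, the file can be hashed and compared against the pointer's oid. A minimal sketch, assuming the weights were saved locally as `pytorch_model.bin`:

```python
import hashlib

# Expected digest: the oid from the LFS pointer above.
EXPECTED = "235020151948405696e1986faeeea010f18ba6bc22f15703fa804c698965d055"

h = hashlib.sha256()
with open("pytorch_model.bin", "rb") as f:            # assumed local path
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        h.update(chunk)

print("OK" if h.hexdigest() == EXPECTED else "checksum mismatch")
```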