sharren committed
Commit c831ad4
1 Parent(s): 895e2b7

Model save

README.md ADDED
@@ -0,0 +1,84 @@
---
license: apache-2.0
base_model: google/vit-base-patch16-224
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: vit-lr-cosine-restarts
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-lr-cosine-restarts

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8289
- Accuracy: 0.8405
- Precision: 0.8428
- Recall: 0.8405
- F1: 0.8367

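For reference, a checkpoint from this run can be loaded like any other ViT image classifier through the `transformers` pipeline. This is only a usage sketch: the repository id `sharren/vit-lr-cosine-restarts` and the image path below are assumptions, not values stated in this card.

```python
# Minimal inference sketch. The repo id and image path are placeholders;
# point them at the actual checkpoint and an image of your own.
from transformers import pipeline

classifier = pipeline("image-classification", model="sharren/vit-lr-cosine-restarts")
predictions = classifier("example.jpg")  # list of {"label": ..., "score": ...} dicts
print(predictions[:3])
```
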
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a rough code equivalent is sketched after the list):
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- num_epochs: 100
- mixed_precision_training: Native AMP

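A rough reconstruction of these settings as a `TrainingArguments`/`Trainer` setup is sketched below. Because the card does not name the dataset or label set, the number of labels, the dataset objects, and the `compute_metrics` function are placeholders, and the evaluation cadence of 100 steps is inferred from the results table rather than stated explicitly.

```python
# Sketch of a Trainer setup matching the hyperparameters listed above.
# Dataset objects, the label count, and compute_metrics are placeholders;
# the card does not document them.
from transformers import Trainer, TrainingArguments, ViTForImageClassification

NUM_LABELS = 2  # placeholder: the actual class count is not stated in the card

model = ViTForImageClassification.from_pretrained(
    "google/vit-base-patch16-224",
    num_labels=NUM_LABELS,
    ignore_mismatched_sizes=True,  # swap out the original 1000-class ImageNet head
)

training_args = TrainingArguments(
    output_dir="vit-lr-cosine-restarts",
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine_with_restarts",
    num_train_epochs=100,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="steps",  # inferred: the table reports metrics every 100 steps
    eval_steps=100,
    logging_steps=100,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,      # placeholder: training data is not documented
    eval_dataset=eval_dataset,        # placeholder: evaluation data is not documented
    compute_metrics=compute_metrics,  # see the metrics sketch after the results table
)
trainer.train()
```
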
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.5734 | 0.31 | 100 | 0.6008 | 0.7885 | 0.7631 | 0.7885 | 0.7655 |
| 0.5602 | 0.62 | 200 | 0.7843 | 0.7542 | 0.7425 | 0.7542 | 0.7004 |
| 0.7117 | 0.93 | 300 | 0.6222 | 0.7660 | 0.8158 | 0.7660 | 0.7754 |
| 0.4445 | 1.25 | 400 | 0.5481 | 0.7923 | 0.8181 | 0.7923 | 0.7999 |
| 0.3471 | 1.56 | 500 | 0.5285 | 0.8218 | 0.8158 | 0.8218 | 0.8048 |
| 0.3144 | 1.87 | 600 | 0.5565 | 0.7961 | 0.8312 | 0.7961 | 0.8023 |
| 0.1702 | 2.18 | 700 | 0.5404 | 0.8256 | 0.8320 | 0.8256 | 0.8240 |
| 0.2557 | 2.49 | 800 | 0.5153 | 0.8402 | 0.8327 | 0.8402 | 0.8301 |
| 0.1579 | 2.8 | 900 | 0.5867 | 0.8218 | 0.8420 | 0.8218 | 0.8250 |
| 0.0815 | 3.12 | 1000 | 0.6218 | 0.8402 | 0.8476 | 0.8402 | 0.8351 |
| 0.1075 | 3.43 | 1100 | 0.6123 | 0.8429 | 0.8456 | 0.8429 | 0.8342 |
| 0.161 | 3.74 | 1200 | 0.6439 | 0.8509 | 0.8478 | 0.8509 | 0.8419 |
| 0.0446 | 4.05 | 1300 | 0.6347 | 0.8561 | 0.8515 | 0.8561 | 0.8516 |
| 0.1209 | 4.36 | 1400 | 0.6838 | 0.8454 | 0.8482 | 0.8454 | 0.8454 |
| 0.006 | 4.67 | 1500 | 0.7756 | 0.8395 | 0.8375 | 0.8395 | 0.8363 |
| 0.0219 | 4.98 | 1600 | 0.8815 | 0.8280 | 0.8368 | 0.8280 | 0.8271 |
| 0.0616 | 5.3 | 1700 | 1.0825 | 0.8155 | 0.8128 | 0.8155 | 0.7864 |
| 0.0305 | 5.61 | 1800 | 0.8289 | 0.8405 | 0.8428 | 0.8405 | 0.8367 |

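In the table above, accuracy and recall agree on every row, which is what weighted-average metrics produce; a `compute_metrics` function of the following shape would yield these four columns. This is a plausible reconstruction, not the exact script behind this run.

```python
# Plausible compute_metrics for the columns above: accuracy plus weighted
# precision/recall/F1. The exact function used for this run is not in the card.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, predictions, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, predictions),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```
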
### Framework versions

- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:72acec732c47966b68d2faa14d9105206e4d5aa92ce96624703d728c012b6fce
+ oid sha256:b63414d64d58868f9f12778c6f4f12d4cc8636a13e329ba9be76845535c66c90
  size 343239356
runs/Mar18_15-27-23_9c311a5b3773/events.out.tfevents.1710775645.9c311a5b3773.3314.20 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:ac14bb7fb877136223ec3877943bc85dcc71c156a49992d5b826d9e9a2417ad7
- size 51119
+ oid sha256:289d22891a8b2006aba63d4021afc9c002ffc80289f8e009b7e9faa632abc207
+ size 51473