---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- image-classification
- generated_from_trainer
metrics:
- accuracy
- recall
- f1
- precision
model-index:
- name: vit-base-16-thesis-demo-HAM10000
  results: []
---

# vit-base-16-thesis-demo-HAM10000

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the ahishamm/HAM_db_enhanced_balanced_reduced_50_20_20_50 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5296
- Accuracy: 0.8344
- Recall: 0.8344
- F1: 0.8344
- Precision: 0.8344
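
Accuracy, recall, F1, and precision are identical at every checkpoint, which is the expected signature of micro-averaged metrics in single-label multiclass classification: every misclassified sample counts as exactly one false positive and one false negative, so micro precision, micro recall, micro F1, and accuracy all reduce to the fraction of correct predictions. A minimal sketch of a `compute_metrics` function that reproduces this behaviour (the function name and the use of scikit-learn are assumptions, not taken from the original training script):

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support


def compute_metrics(eval_pred):
    """Micro-averaged metrics; all four values coincide for single-label multiclass data."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(labels, preds, average="micro")
    return {
        "accuracy": accuracy_score(labels, preds),
        "recall": recall,
        "f1": f1,
        "precision": precision,
    }
```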

## Model description

This is a Vision Transformer (ViT-Base, 16×16 patches, 224×224 input resolution) pretrained on ImageNet-21k and fine-tuned with an image-classification head for dermatoscopic skin-lesion classification. HAM10000 covers seven diagnostic categories (akiec, bcc, bkl, df, mel, nv, vasc); the dataset name suggests this derivative retains those classes, but consult the dataset card to confirm.
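
A minimal loading sketch; the hub id below is assumed from the model name and may need adjusting to the actual repository path:

```python
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "ahishamm/vit-base-16-thesis-demo-HAM10000"  # assumed repo path
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
print(model.config.id2label)  # should enumerate the lesion classes
```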

## Intended uses & limitations

The "thesis-demo" name suggests this checkpoint was trained as a demonstration for academic work. It is intended for research and experimentation on dermatoscopic image classification. It has not been clinically validated and should not be used for medical diagnosis.
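
For quick experimentation, a hedged inference sketch via the `pipeline` API (the hub id is the same assumption as above, and the image path is a placeholder):

```python
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="ahishamm/vit-base-16-thesis-demo-HAM10000",  # assumed repo path
)
predictions = classifier("path/to/dermatoscopy_image.jpg")  # placeholder path
for p in predictions:
    print(f"{p['label']}: {p['score']:.3f}")
```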

## Training and evaluation data

The model was fine-tuned on [ahishamm/HAM_db_enhanced_balanced_reduced_50_20_20_50](https://huggingface.co/datasets/ahishamm/HAM_db_enhanced_balanced_reduced_50_20_20_50), a processed derivative of the HAM10000 dermatoscopy dataset. The identifier suggests the images were enhanced, the classes balanced, and the data reduced and split (plausibly 50/20/20); see the dataset card for the exact preprocessing.
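
A sketch for loading the dataset; the split names are not documented here, so inspect the returned `DatasetDict`:

```python
from datasets import load_dataset

ds = load_dataset("ahishamm/HAM_db_enhanced_balanced_reduced_50_20_20_50")
print(ds)  # shows the available splits, features, and sizes
```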

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
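
A hedged `TrainingArguments` sketch reproducing the configuration above; `output_dir` and the evaluation cadence are assumptions (the results table suggests evaluation every 50 steps):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-16-thesis-demo-HAM10000",  # assumed
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=4,
    fp16=True,  # "Native AMP" mixed-precision training
    evaluation_strategy="steps",  # assumed from the eval logs every 50 steps
    eval_steps=50,
)
```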

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Recall | F1     | Precision |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:---------:|
| 1.4855        | 0.12  | 50   | 1.3519          | 0.5093   | 0.5093 | 0.5093 | 0.5093    |
| 1.044         | 0.23  | 100  | 1.0515          | 0.6268   | 0.6268 | 0.6268 | 0.6268    |
| 1.0774        | 0.35  | 150  | 1.2104          | 0.5681   | 0.5681 | 0.5681 | 0.5681    |
| 0.9508        | 0.46  | 200  | 1.0624          | 0.6061   | 0.6061 | 0.6061 | 0.6061    |
| 0.9522        | 0.58  | 250  | 0.9338          | 0.6449   | 0.6449 | 0.6449 | 0.6449    |
| 0.774         | 0.69  | 300  | 0.8939          | 0.6676   | 0.6676 | 0.6676 | 0.6676    |
| 0.7675        | 0.81  | 350  | 0.7742          | 0.7183   | 0.7183 | 0.7183 | 0.7183    |
| 0.7167        | 0.92  | 400  | 0.7695          | 0.7216   | 0.7216 | 0.7216 | 0.7216    |
| 0.5204        | 1.04  | 450  | 0.8005          | 0.7303   | 0.7303 | 0.7303 | 0.7303    |
| 0.456         | 1.15  | 500  | 0.8523          | 0.6903   | 0.6903 | 0.6903 | 0.6903    |
| 0.5421        | 1.27  | 550  | 0.6753          | 0.7543   | 0.7543 | 0.7543 | 0.7543    |
| 0.4446        | 1.38  | 600  | 0.6042          | 0.7810   | 0.7810 | 0.7810 | 0.7810    |
| 0.455         | 1.5   | 650  | 0.6913          | 0.7410   | 0.7410 | 0.7410 | 0.7410    |
| 0.4175        | 1.61  | 700  | 0.6142          | 0.7810   | 0.7810 | 0.7810 | 0.7810    |
| 0.3626        | 1.73  | 750  | 0.5831          | 0.8004   | 0.8004 | 0.8004 | 0.8004    |
| 0.4816        | 1.84  | 800  | 0.5586          | 0.7891   | 0.7891 | 0.7891 | 0.7891    |
| 0.3257        | 1.96  | 850  | 0.5759          | 0.7991   | 0.7991 | 0.7991 | 0.7991    |
| 0.3111        | 2.07  | 900  | 0.6100          | 0.7931   | 0.7931 | 0.7931 | 0.7931    |
| 0.2052        | 2.19  | 950  | 0.5674          | 0.8111   | 0.8111 | 0.8111 | 0.8111    |
| 0.2273        | 2.3   | 1000 | 0.5975          | 0.8017   | 0.8017 | 0.8017 | 0.8017    |
| 0.3007        | 2.42  | 1050 | 0.5714          | 0.8204   | 0.8204 | 0.8204 | 0.8204    |
| 0.2812        | 2.53  | 1100 | 0.6081          | 0.8004   | 0.8004 | 0.8004 | 0.8004    |
| 0.2661        | 2.65  | 1150 | 0.5653          | 0.8224   | 0.8224 | 0.8224 | 0.8224    |
| 0.1796        | 2.76  | 1200 | 0.5447          | 0.8338   | 0.8338 | 0.8338 | 0.8338    |
| 0.1882        | 2.88  | 1250 | 0.5357          | 0.8284   | 0.8284 | 0.8284 | 0.8284    |
| 0.1596        | 3.0   | 1300 | 0.5296          | 0.8344   | 0.8344 | 0.8344 | 0.8344    |
| 0.075         | 3.11  | 1350 | 0.5876          | 0.8198   | 0.8198 | 0.8198 | 0.8198    |
| 0.1128        | 3.23  | 1400 | 0.5612          | 0.8338   | 0.8338 | 0.8338 | 0.8338    |
| 0.0677        | 3.34  | 1450 | 0.5911          | 0.8331   | 0.8331 | 0.8331 | 0.8331    |
| 0.0794        | 3.46  | 1500 | 0.5971          | 0.8304   | 0.8304 | 0.8304 | 0.8304    |
| 0.0367        | 3.57  | 1550 | 0.5634          | 0.8378   | 0.8378 | 0.8378 | 0.8378    |
| 0.0279        | 3.69  | 1600 | 0.5674          | 0.8391   | 0.8391 | 0.8391 | 0.8391    |
| 0.0216        | 3.8   | 1650 | 0.5777          | 0.8358   | 0.8358 | 0.8358 | 0.8358    |
| 0.0161        | 3.92  | 1700 | 0.5608          | 0.8438   | 0.8438 | 0.8438 | 0.8438    |


### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0