---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: beit-finetuned-pokemon
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# beit-finetuned-pokemon

This model is a fine-tuned version of [ydmeira/beit-finetuned-pokemon](https://huggingface.co/ydmeira/beit-finetuned-pokemon) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0222
- Mean Iou: 0.4964
- Mean Accuracy: 0.9927
- Overall Accuracy: 0.9927
- Per Category Iou: [0.0, 0.9927382211696605]
- Per Category Accuracy: [nan, 0.9927382211696605]
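
For context, the mean metrics above are simple averages over the two per-category values, with `nan` entries ignored. A minimal illustration of the arithmetic (not the actual evaluation code; the category roles are assumed):

```python
import numpy as np

# Per-category values reported above; index 0 is assumed to be the unlabelled /
# background class and index 1 the Pokémon mask class.
per_category_iou = np.array([0.0, 0.9927382211696605])
per_category_accuracy = np.array([np.nan, 0.9927382211696605])

mean_iou = np.nanmean(per_category_iou)            # (0.0 + 0.9927) / 2 ≈ 0.4964
mean_accuracy = np.nanmean(per_category_accuracy)  # nan ignored, so ≈ 0.9927
```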

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2

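The training script itself is not included in this card; as a rough, hypothetical sketch, the settings above would correspond to a `TrainingArguments` configuration along the following lines (the `output_dir` is assumed):

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above; the actual
# training script used for this model is not part of this card.
training_args = TrainingArguments(
    output_dir="beit-finetuned-pokemon",  # assumed output directory
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer defaults
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```
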
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou          | Per Category Accuracy     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------------:|:-------------------------:|
| 0.044         | 0.11  | 500  | 0.0430          | 0.4929   | 0.9857        | 0.9857           | [0.0, 0.9857017551704262] | [nan, 0.9857017551704262] |
| 0.0495        | 0.21  | 1000 | 0.0345          | 0.4960   | 0.9920        | 0.9920           | [0.0, 0.9920118130744071] | [nan, 0.9920118130744071] |
| 0.0382        | 0.32  | 1500 | 0.0399          | 0.4947   | 0.9894        | 0.9894           | [0.0, 0.9893992290428889] | [nan, 0.9893992290428889] |
| 0.0361        | 0.43  | 2000 | 0.0311          | 0.4963   | 0.9926        | 0.9926           | [0.0, 0.9925511589842341] | [nan, 0.9925511589842341] |
| 0.04          | 0.53  | 2500 | 0.0722          | 0.4920   | 0.9840        | 0.9840           | [0.0, 0.9839730680037156] | [nan, 0.9839730680037156] |
| 0.0308        | 0.64  | 3000 | 0.0319          | 0.4977   | 0.9954        | 0.9954           | [0.0, 0.9954462252146663] | [nan, 0.9954462252146663] |
| 0.0391        | 0.75  | 3500 | 0.1028          | 0.4837   | 0.9674        | 0.9674           | [0.0, 0.9673708120597321] | [nan, 0.9673708120597321] |
| 0.0425        | 0.85  | 4000 | 0.0330          | 0.4973   | 0.9946        | 0.9946           | [0.0, 0.9946091381677958] | [nan, 0.9946091381677958] |
| 0.0321        | 0.96  | 4500 | 0.0259          | 0.4963   | 0.9925        | 0.9925           | [0.0, 0.9925195785900393] | [nan, 0.9925195785900393] |
| 0.031         | 1.07  | 5000 | 0.0270          | 0.4965   | 0.9930        | 0.9930           | [0.0, 0.9930111407071547] | [nan, 0.9930111407071547] |
| 0.0281        | 1.17  | 5500 | 0.0367          | 0.4933   | 0.9866        | 0.9866           | [0.0, 0.9865881607581373] | [nan, 0.9865881607581373] |
| 0.0325        | 1.28  | 6000 | 0.0327          | 0.4940   | 0.9880        | 0.9880           | [0.0, 0.9879893562856097] | [nan, 0.9879893562856097] |
| 0.0253        | 1.39  | 6500 | 0.0237          | 0.4968   | 0.9937        | 0.9937           | [0.0, 0.9936538460005984] | [nan, 0.9936538460005984] |
| 0.0258        | 1.49  | 7000 | 0.0241          | 0.4964   | 0.9928        | 0.9928           | [0.0, 0.9927783017073394] | [nan, 0.9927783017073394] |
| 0.0266        | 1.6   | 7500 | 0.0234          | 0.4962   | 0.9924        | 0.9924           | [0.0, 0.9923954115635184] | [nan, 0.9923954115635184] |
| 0.0223        | 1.71  | 8000 | 0.0264          | 0.4964   | 0.9928        | 0.9928           | [0.0, 0.9928421413266322] | [nan, 0.9928421413266322] |
| 0.0212        | 1.81  | 8500 | 0.0235          | 0.4960   | 0.9920        | 0.9920           | [0.0, 0.9920402354291824] | [nan, 0.9920402354291824] |
| 0.0196        | 1.92  | 9000 | 0.0222          | 0.4964   | 0.9927        | 0.9927           | [0.0, 0.9927382211696605] | [nan, 0.9927382211696605] |


### Framework versions

- Transformers 4.21.2
- Pytorch 1.12.1+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1