# vit-base_rvl-cdip
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the RVL-CDIP dataset. It achieves the following results on the evaluation set:
- Loss: 0.5535
- Accuracy: 0.8970
- Brier Loss: 0.1768
- NLL: 1.0978
- F1 Micro: 0.8970
- F1 Macro: 0.8972
- ECE: 0.0801
- AURC: 0.0180
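The card itself gives no usage notes, so the following is a minimal inference sketch assuming the checkpoint exposes the standard image-classification head; the identifier `vit-base_rvl-cdip` is a placeholder for wherever the weights are actually stored (hub id or local directory).

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder path: substitute the real hub id or local checkpoint directory.
checkpoint = "vit-base_rvl-cdip"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)
model.eval()

# Any scanned document page works as input; the file name here is illustrative.
image = Image.open("document_page.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```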
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
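These hyperparameters map directly onto `TrainingArguments`. The sketch below is one plausible reconstruction of that configuration, not the exact training script; `output_dir` and the per-epoch evaluation/save strategy are assumptions based on the per-epoch validation results reported in the next section.

```python
from transformers import TrainingArguments

# Sketch of the reported configuration (Adam betas/epsilon are the library defaults).
training_args = TrainingArguments(
    output_dir="vit-base_rvl-cdip",   # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=16,   # 4 x 16 = total train batch size of 64
    num_train_epochs=10,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",      # assumption: metrics are reported once per epoch
    save_strategy="epoch",
)
```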
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Brier Loss | NLL    | F1 Micro | F1 Macro | ECE    | AURC   |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| 0.676         | 1.0   | 5000  | 0.6451          | 0.8230   | 0.2574     | 1.2627 | 0.8230   | 0.8237   | 0.0458 | 0.0425 |
| 0.4207        | 2.0   | 10000 | 0.4251          | 0.8766   | 0.1800     | 1.2821 | 0.8766   | 0.8779   | 0.0154 | 0.0218 |
| 0.3335        | 3.0   | 15000 | 0.3914          | 0.8861   | 0.1676     | 1.2589 | 0.8861   | 0.8858   | 0.0252 | 0.0192 |
| 0.2447        | 4.0   | 20000 | 0.3687          | 0.8934   | 0.1574     | 1.2243 | 0.8934   | 0.8937   | 0.0331 | 0.0164 |
| 0.1623        | 5.0   | 25000 | 0.3843          | 0.8976   | 0.1583     | 1.1553 | 0.8976   | 0.8973   | 0.0461 | 0.0159 |
| 0.1083        | 6.0   | 30000 | 0.4131          | 0.8964   | 0.1624     | 1.1514 | 0.8964   | 0.8967   | 0.0581 | 0.0163 |
| 0.0652        | 7.0   | 35000 | 0.4633          | 0.8966   | 0.1690     | 1.1300 | 0.8966   | 0.8967   | 0.0692 | 0.0169 |
| 0.0361        | 8.0   | 40000 | 0.5068          | 0.8976   | 0.1723     | 1.1161 | 0.8976   | 0.8976   | 0.0737 | 0.0175 |
| 0.0192        | 9.0   | 45000 | 0.5418          | 0.8982   | 0.1748     | 1.1015 | 0.8982   | 0.8983   | 0.0779 | 0.0179 |
| 0.0111        | 10.0  | 50000 | 0.5535          | 0.8970   | 0.1768     | 1.0978 | 0.8970   | 0.8972   | 0.0801 | 0.0180 |
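The calibration columns (Brier Loss, NLL, ECE) are not standard `Trainer` outputs. A plausible way to compute them from raw logits and integer labels is sketched below; the exact definitions used during training are not documented here, and the 15 equal-width confidence bins for ECE are an assumption.

```python
import numpy as np

def calibration_metrics(logits: np.ndarray, labels: np.ndarray, n_bins: int = 15):
    """Brier score, NLL, and ECE from raw logits and integer labels (sketch)."""
    # Softmax probabilities, numerically stabilised.
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = exp / exp.sum(axis=1, keepdims=True)

    n, k = probs.shape
    one_hot = np.eye(k)[labels]

    # Brier score: mean squared error between probabilities and one-hot targets.
    brier = np.mean(np.sum((probs - one_hot) ** 2, axis=1))
    # Negative log-likelihood of the true class.
    nll = -np.mean(np.log(probs[np.arange(n), labels] + 1e-12))

    # Expected calibration error over equal-width confidence bins.
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    correct = (pred == labels).astype(float)

    ece = 0.0
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())

    return {"brier": brier, "nll": nll, "ece": ece}
```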
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2