# square_run_second_vote_full_pic_75_age_gender

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the metrics):
- Loss: 1.3432
- F1 Macro: 0.3328
- F1 Micro: 0.4697
- F1 Weighted: 0.4098
- Precision Macro: 0.3270
- Precision Micro: 0.4697
- Precision Weighted: 0.4161
- Recall Macro: 0.3849
- Recall Micro: 0.4697
- Recall Weighted: 0.4697
- Accuracy: 0.4697
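A minimal inference sketch, assuming the checkpoint (with its image processor) is available under the repository id below and that `example.jpg` is a placeholder input image:

```python
# Minimal inference sketch; the repository id and the input image path are
# assumptions for illustration.
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="corranm/square_run_second_vote_full_pic_75_age_gender",
)

image = Image.open("example.jpg")  # placeholder input image
for pred in classifier(image):
    print(f"{pred['label']}: {pred['score']:.3f}")
```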
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: 8-bit AdamW (OptimizerNames.ADAMW_BNB, via bitsandbytes) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
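The listed hyperparameters map onto a `TrainingArguments` configuration along these lines; this is a hedged reconstruction, and the output directory, evaluation/logging strategy, and surrounding `Trainer` wiring are assumptions rather than documented settings:

```python
# Hedged reconstruction of the training configuration from the list above;
# output_dir, eval_strategy, and logging_strategy are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="square_run_second_vote_full_pic_75_age_gender",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_bnb_8bit",      # 8-bit AdamW from bitsandbytes (OptimizerNames.ADAMW_BNB)
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
    eval_strategy="epoch",       # inferred from the per-epoch results table
    logging_strategy="epoch",
)
```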
### Training results
Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Micro | F1 Weighted | Precision Macro | Precision Micro | Precision Weighted | Recall Macro | Recall Micro | Recall Weighted | Accuracy |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.874 | 1.0 | 58 | 1.8367 | 0.1624 | 0.2576 | 0.2032 | 0.1524 | 0.2576 | 0.1870 | 0.2024 | 0.2576 | 0.2576 | 0.2576 |
1.9776 | 2.0 | 116 | 1.9333 | 0.0771 | 0.1439 | 0.0881 | 0.0743 | 0.1439 | 0.0940 | 0.1402 | 0.1439 | 0.1439 | 0.1439 |
1.645 | 3.0 | 174 | 1.8058 | 0.1674 | 0.2803 | 0.1932 | 0.1766 | 0.2803 | 0.1880 | 0.2217 | 0.2803 | 0.2803 | 0.2803 |
1.6353 | 4.0 | 232 | 1.6974 | 0.2498 | 0.3636 | 0.2906 | 0.2124 | 0.3636 | 0.2464 | 0.3095 | 0.3636 | 0.3636 | 0.3636 |
1.6165 | 5.0 | 290 | 1.6144 | 0.2440 | 0.3409 | 0.2804 | 0.2940 | 0.3409 | 0.3290 | 0.2857 | 0.3409 | 0.3409 | 0.3409 |
1.7496 | 6.0 | 348 | 1.7440 | 0.2610 | 0.3636 | 0.3308 | 0.2845 | 0.3636 | 0.3528 | 0.2888 | 0.3636 | 0.3636 | 0.3636 |
1.8783 | 7.0 | 406 | 1.5126 | 0.3535 | 0.4167 | 0.3994 | 0.3637 | 0.4167 | 0.4048 | 0.3646 | 0.4167 | 0.4167 | 0.4167 |
1.2903 | 8.0 | 464 | 1.5240 | 0.3589 | 0.4167 | 0.4036 | 0.3702 | 0.4167 | 0.4107 | 0.3644 | 0.4167 | 0.4167 | 0.4167 |
1.8885 | 9.0 | 522 | 1.5423 | 0.3727 | 0.4470 | 0.4283 | 0.3800 | 0.4470 | 0.4329 | 0.3856 | 0.4470 | 0.4470 | 0.4470 |
1.0726 | 10.0 | 580 | 1.8002 | 0.3168 | 0.4015 | 0.3651 | 0.3364 | 0.4015 | 0.3788 | 0.3456 | 0.4015 | 0.4015 | 0.4015 |
1.2297 | 11.0 | 638 | 1.9532 | 0.3087 | 0.3788 | 0.3653 | 0.3752 | 0.3788 | 0.4335 | 0.3300 | 0.3788 | 0.3788 | 0.3788 |
0.7152 | 12.0 | 696 | 1.8452 | 0.3120 | 0.3864 | 0.3677 | 0.3922 | 0.3864 | 0.4313 | 0.3156 | 0.3864 | 0.3864 | 0.3864 |
0.7479 | 13.0 | 754 | 1.7619 | 0.3686 | 0.4394 | 0.4348 | 0.3981 | 0.4394 | 0.4546 | 0.3645 | 0.4394 | 0.4394 | 0.4394 |
0.2766 | 14.0 | 812 | 1.8000 | 0.3931 | 0.4924 | 0.4657 | 0.4146 | 0.4924 | 0.4792 | 0.4080 | 0.4924 | 0.4924 | 0.4924 |
0.4092 | 15.0 | 870 | 2.0428 | 0.3611 | 0.4318 | 0.4252 | 0.3772 | 0.4318 | 0.4421 | 0.3673 | 0.4318 | 0.4318 | 0.4318 |
0.1272 | 16.0 | 928 | 2.1450 | 0.3493 | 0.4242 | 0.4203 | 0.3651 | 0.4242 | 0.4419 | 0.3598 | 0.4242 | 0.4242 | 0.4242 |
0.2751 | 17.0 | 986 | 2.3002 | 0.3782 | 0.4394 | 0.4357 | 0.4548 | 0.4394 | 0.5101 | 0.3712 | 0.4394 | 0.4394 | 0.4394 |
0.3277 | 18.0 | 1044 | 2.2109 | 0.3832 | 0.4470 | 0.4450 | 0.4073 | 0.4470 | 0.4770 | 0.3856 | 0.4470 | 0.4470 | 0.4470 |
0.0134 | 19.0 | 1102 | 2.4450 | 0.3585 | 0.4470 | 0.4219 | 0.3987 | 0.4470 | 0.4533 | 0.3729 | 0.4470 | 0.4470 | 0.4470 |
0.0737 | 20.0 | 1160 | 2.5434 | 0.3468 | 0.4091 | 0.4054 | 0.3581 | 0.4091 | 0.4161 | 0.3508 | 0.4091 | 0.4091 | 0.4091 |
0.0203 | 21.0 | 1218 | 2.8118 | 0.3895 | 0.4773 | 0.4493 | 0.4176 | 0.4773 | 0.4699 | 0.4098 | 0.4773 | 0.4773 | 0.4773 |
0.0072 | 22.0 | 1276 | 2.7996 | 0.3620 | 0.4242 | 0.4165 | 0.3783 | 0.4242 | 0.4359 | 0.3729 | 0.4242 | 0.4242 | 0.4242 |
0.1251 | 23.0 | 1334 | 2.9001 | 0.4009 | 0.4394 | 0.4291 | 0.4500 | 0.4394 | 0.4703 | 0.4067 | 0.4394 | 0.4394 | 0.4394 |
0.0054 | 24.0 | 1392 | 2.8660 | 0.4011 | 0.4470 | 0.4327 | 0.4245 | 0.4470 | 0.4535 | 0.4147 | 0.4470 | 0.4470 | 0.4470 |
0.0091 | 25.0 | 1450 | 2.8868 | 0.3852 | 0.4167 | 0.4086 | 0.3965 | 0.4167 | 0.4115 | 0.3858 | 0.4167 | 0.4167 | 0.4167 |
0.002 | 26.0 | 1508 | 2.9311 | 0.3952 | 0.4394 | 0.4272 | 0.4054 | 0.4394 | 0.4343 | 0.4043 | 0.4394 | 0.4394 | 0.4394 |
0.0008 | 27.0 | 1566 | 2.9526 | 0.4052 | 0.4470 | 0.4388 | 0.4173 | 0.4470 | 0.4483 | 0.4118 | 0.4470 | 0.4470 | 0.4470 |
0.002 | 28.0 | 1624 | 3.0159 | 0.4074 | 0.4470 | 0.4389 | 0.4227 | 0.4470 | 0.4489 | 0.4116 | 0.4470 | 0.4470 | 0.4470 |
0.0017 | 29.0 | 1682 | 2.9797 | 0.4121 | 0.4545 | 0.4431 | 0.4192 | 0.4545 | 0.4464 | 0.4193 | 0.4545 | 0.4545 | 0.4545 |
0.0016 | 30.0 | 1740 | 2.9981 | 0.3677 | 0.4394 | 0.4256 | 0.3741 | 0.4394 | 0.4271 | 0.3761 | 0.4394 | 0.4394 | 0.4394 |
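The metric columns above (accuracy plus macro/micro/weighted F1, precision, and recall) can be produced by a `compute_metrics` callback along the following lines; this is a sketch using scikit-learn, not necessarily the exact code used for this run:

```python
# Sketch of a compute_metrics callback matching the columns in the table above;
# the exact implementation used for this run is not documented.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    metrics = {"accuracy": accuracy_score(labels, preds)}
    for avg in ("macro", "micro", "weighted"):
        metrics[f"f1_{avg}"] = f1_score(labels, preds, average=avg)
        metrics[f"precision_{avg}"] = precision_score(labels, preds, average=avg, zero_division=0)
        metrics[f"recall_{avg}"] = recall_score(labels, preds, average=avg, zero_division=0)
    return metrics
```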
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.1
- Tokenizers 0.21.0