---
license: apache-2.0
base_model: facebook/convnextv2-base-1k-224
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: convnextv2-base-1k-224-for-pre_evaluation
  results: []
---

# convnextv2-base-1k-224-for-pre_evaluation

This model is a fine-tuned version of [facebook/convnextv2-base-1k-224](https://huggingface.co/facebook/convnextv2-base-1k-224) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 1.4479
- Accuracy: 0.4382
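
The checkpoint can be loaded like any other Transformers image-classification model. Below is a minimal inference sketch, not part of the original card: the repository id `Prot10/convnextv2-base-1k-224-for-pre_evaluation` is an assumption inferred from the card title, and `image.jpg` is a placeholder input.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumed repo id; substitute the actual checkpoint path if it differs.
checkpoint = "Prot10/convnextv2-base-1k-224-for-pre_evaluation"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)

image = Image.open("image.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```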

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
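
For reproducibility, these settings map directly onto Transformers' `TrainingArguments`. The sketch below is a reconstruction rather than the original training script: `output_dir` is a placeholder, and the per-epoch evaluation/logging strategy is an assumption based on the per-epoch rows in the results table. The effective batch size of 128 follows from 32 per device × 4 accumulation steps.

```python
from transformers import TrainingArguments

# Reconstruction of the reported hyperparameters; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="convnextv2-base-1k-224-for-pre_evaluation",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # effective train batch size: 32 * 4 = 128
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",     # assumption: the card logs eval per epoch
    logging_strategy="epoch",
)
```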

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.5952        | 0.93  | 10   | 1.5511          | 0.2960   |
| 1.5238        | 1.95  | 21   | 1.5091          | 0.3427   |
| 1.4881        | 2.98  | 32   | 1.4854          | 0.3450   |
| 1.4708        | 4.0   | 43   | 1.4616          | 0.3473   |
| 1.4361        | 4.93  | 53   | 1.4417          | 0.3450   |
| 1.3764        | 5.95  | 64   | 1.4135          | 0.3753   |
| 1.3333        | 6.98  | 75   | 1.3822          | 0.3986   |
| 1.3296        | 8.0   | 86   | 1.4112          | 0.3636   |
| 1.2798        | 8.93  | 96   | 1.4038          | 0.3893   |
| 1.3129        | 9.95  | 107  | 1.4241          | 0.3776   |
| 1.3014        | 10.98 | 118  | 1.3570          | 0.3893   |
| 1.2332        | 12.0  | 129  | 1.4073          | 0.3893   |
| 1.212         | 12.93 | 139  | 1.3770          | 0.4033   |
| 1.1763        | 13.95 | 150  | 1.3891          | 0.3963   |
| 1.124         | 14.98 | 161  | 1.3915          | 0.4126   |
| 1.0963        | 16.0  | 172  | 1.4099          | 0.4149   |
| 1.0547        | 16.93 | 182  | 1.4206          | 0.4033   |
| 1.0631        | 17.95 | 193  | 1.4041          | 0.4196   |
| 0.9911        | 18.98 | 204  | 1.4272          | 0.4149   |
| 1.005         | 20.0  | 215  | 1.4211          | 0.4219   |
| 0.9663        | 20.93 | 225  | 1.4662          | 0.4009   |
| 0.9533        | 21.95 | 236  | 1.4286          | 0.4336   |
| 0.9506        | 22.98 | 247  | 1.4135          | 0.4312   |
| 0.8973        | 24.0  | 258  | 1.4428          | 0.4266   |
| 0.8807        | 24.93 | 268  | 1.4479          | 0.4382   |
| 0.8731        | 25.95 | 279  | 1.4429          | 0.4289   |
| 0.8472        | 26.98 | 290  | 1.4461          | 0.4312   |
| 0.8348        | 27.91 | 300  | 1.4531          | 0.4336   |
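
The evaluation metrics reported at the top of this card (loss 1.4479, accuracy 0.4382) match the epoch 24.93 row (step 268), which is the best validation accuracy of the run.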

### Framework versions

- Transformers 4.32.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3