---
license: apache-2.0
base_model: google/vit-base-patch16-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: vit-base-patch16-224-RU4-40
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: validation
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.7833333333333333
---

# vit-base-patch16-224-RU4-40

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 49869186446092573277078519432609792.0000
- Accuracy: 0.7833

Note that the reported loss is anomalously large, which suggests a numerical overflow in the loss computation during training; the accuracy metric appears unaffected.
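As a quick-start, the checkpoint can be loaded for inference with the `transformers` image-classification pipeline. This is a minimal sketch; the hub repo id `Augusto777/vit-base-patch16-224-RU4-40` is an assumption based on the model name above, and `example.jpg` is a placeholder path.

```python
# Minimal inference sketch for this checkpoint (repo id is assumed).
from transformers import pipeline

MODEL_ID = "Augusto777/vit-base-patch16-224-RU4-40"  # assumed hub repo id


def classify(image_path: str):
    """Return the top predicted labels for a local image file."""
    classifier = pipeline("image-classification", model=MODEL_ID)
    return classifier(image_path)


# Example (hypothetical input file):
# predictions = classify("example.jpg")
```

The pipeline handles resizing and normalization with the image processor bundled in the checkpoint, so no manual preprocessing is needed.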

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5.5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 40
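The derived quantities in this list follow from the base settings. A small sketch, assuming the 760 total optimizer steps shown in the last row of the results table below:

```python
# Hedged reconstruction of the hyperparameters as a plain dict,
# checking the derived quantities the card reports.
hparams = {
    "learning_rate": 5.5e-05,
    "train_batch_size": 32,
    "eval_batch_size": 32,
    "seed": 42,
    "gradient_accumulation_steps": 4,
    "lr_scheduler_warmup_ratio": 0.05,
    "num_epochs": 40,
}

# total_train_batch_size = per-device batch size * gradient accumulation steps
total_train_batch_size = (
    hparams["train_batch_size"] * hparams["gradient_accumulation_steps"]
)  # 32 * 4 = 128

# With 760 optimizer steps in total, a warmup ratio of 0.05
# corresponds to 38 linear-warmup steps.
total_steps = 760
warmup_steps = int(hparams["lr_scheduler_warmup_ratio"] * total_steps)  # 38
```

Accumulating gradients over 4 micro-batches of 32 gives the effective batch size of 128 without requiring that much GPU memory at once.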

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 62336480581735638025587599492513792.0000 | 0.99 | 19 | 49869186446092573277078519432609792.0000 | 0.4667 |
| 58440439651504818626990770153848832.0000 | 1.97 | 38 | 49869186446092573277078519432609792.0000 | 0.6 |
| 50648389482308178156834519410278400.0000 | 2.96 | 57 | 49869186446092573277078519432609792.0000 | 0.7333 |
| 56102820639337698928052607882625024.0000 | 4.0 | 77 | 49869186446092573277078519432609792.0000 | 0.75 |
| 62336476620327508623021905073405952.0000 | 4.99 | 96 | 49869186446092573277078519432609792.0000 | 0.7667 |
| 56882041501889867672610159036727296.0000 | 5.97 | 115 | 49869186446092573277078519432609792.0000 | 0.75 |
| 51817195026983603992051887699394560.0000 | 6.96 | 134 | 49869186446092573277078519432609792.0000 | 0.75 |
| 52986004533067168453206987262394368.0000 | 8.0 | 154 | 49869186446092573277078519432609792.0000 | 0.6667 |
| 55713231995806303211455059473727488.0000 | 8.99 | 173 | 49869186446092573277078519432609792.0000 | 0.6833 |
| 54934019056070393272028897157840896.0000 | 9.97 | 192 | 49869186446092573277078519432609792.0000 | 0.7833 |
| 51427598460635967917067024161832960.0000 | 10.96 | 211 | 49869186446092573277078519432609792.0000 | 0.7167 |
| 51817198988391733394617582118502400.0000 | 12.0 | 231 | 49869186446092573277078519432609792.0000 | 0.7 |
| 57661238595993278448517617385734144.0000 | 12.99 | 250 | 49869186446092573277078519432609792.0000 | 0.7333 |
| 58050851007973422910393221744951296.0000 | 13.97 | 269 | 49869186446092573277078519432609792.0000 | 0.75 |
| 56882045463297987851803816601059328.0000 | 14.96 | 288 | 49869186446092573277078519432609792.0000 | 0.7333 |
| 56102844407786447673330663832944640.0000 | 16.0 | 308 | 49869186446092573277078519432609792.0000 | 0.75 |
| 56492437012725972792493906660950016.0000 | 16.99 | 327 | 49869186446092573277078519432609792.0000 | 0.7833 |
| 51817195026983603992051887699394560.0000 | 17.97 | 346 | 49869186446092573277078519432609792.0000 | 0.7667 |
| 48310774431549178637090014703386624.0000 | 18.96 | 365 | 49869186446092573277078519432609792.0000 | 0.7667 |
| 58440451535729188387943779701620736.0000 | 20.0 | 385 | 49869186446092573277078519432609792.0000 | 0.7667 |
| 51037990010063943634385077366947840.0000 | 20.99 | 404 | 49869186446092573277078519432609792.0000 | 0.7667 |
| 52206811400371877856493186477719552.0000 | 21.97 | 423 | 49869186446092573277078519432609792.0000 | 0.7333 |
| 54934023017478513451222554722172928.0000 | 22.96 | 442 | 49869186446092573277078519432609792.0000 | 0.75 |
| 58830048102076833686300680093958144.0000 | 24.0 | 462 | 49869186446092573277078519432609792.0000 | 0.75 |
| 52986004533067168453206987262394368.0000 | 24.99 | 481 | 49869186446092573277078519432609792.0000 | 0.7667 |
| 54544418528314618571106302346395648.0000 | 25.97 | 500 | 49869186446092573277078519432609792.0000 | 0.7667 |
| 51644037916400559103047539831603200.0000 | 26.96 | 519 | 49869186446092573277078519432609792.0000 | 0.75 |
| 56102840446378327494137006268612608.0000 | 28.0 | 539 | 49869186446092573277078519432609792.0000 | 0.7667 |
| 59998853646752268744890085237850112.0000 | 28.99 | 558 | 49869186446092573277078519432609792.0000 | 0.7833 |
| 51427598460635967917067024161832960.0000 | 29.97 | 577 | 49869186446092573277078519432609792.0000 | 0.7833 |
| 53375620906455442317648286040719360.0000 | 30.96 | 596 | 49869186446092573277078519432609792.0000 | 0.7667 |
| 58830044140668713507107022529626112.0000 | 32.0 | 616 | 49869186446092573277078519432609792.0000 | 0.75 |
| 52596404005311393752284392450949120.0000 | 32.99 | 635 | 49869186446092573277078519432609792.0000 | 0.7333 |
| 57661234634585149045951922966626304.0000 | 33.97 | 654 | 49869186446092573277078519432609792.0000 | 0.7833 |
| 50648393443716298336028176974610432.0000 | 34.96 | 673 | 49869186446092573277078519432609792.0000 | 0.7667 |
| 56882045463297987851803816601059328.0000 | 36.0 | 693 | 49869186446092573277078519432609792.0000 | 0.7667 |
| 55713231995806303211455059473727488.0000 | 36.99 | 712 | 49869186446092573277078519432609792.0000 | 0.7833 |
| 51427594499227838514501329742725120.0000 | 37.97 | 731 | 49869186446092573277078519432609792.0000 | 0.7833 |
| 52596407966719523154850086870056960.0000 | 38.96 | 750 | 49869186446092573277078519432609792.0000 | 0.7833 |
| 53505475864816321566964104575320064.0000 | 39.48 | 760 | 49869186446092573277078519432609792.0000 | 0.7833 |

### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.0