---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
metrics:
  - f1
model-index:
  - name: alz-mri-vit
    results:
      - task:
          name: image-classification
          type: image-classification
        dataset:
          name: Falah/Alzheimer_MRI
          type: Falah/Alzheimer_MRI
          config: default
          split: train
          args: default
        metrics:
          - name: f1
            type: f1
            value: 0.930865
datasets:
  - Falah/Alzheimer_MRI
---

# alz-mri-vit

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the [Falah/Alzheimer_MRI](https://huggingface.co/datasets/Falah/Alzheimer_MRI) dataset (the fine-tuning procedure is described here). It achieves the following results on the evaluation set:

- Loss: 0.1875
- F1: 0.9309
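
The model can be loaded for inference with the `transformers` image-classification pipeline. The following is a minimal sketch: the Hub repository id `spolivin/alz-mri-vit` and the input file name are assumptions inferred from the card title, so adjust them to wherever the weights and image actually live.

```python
from transformers import pipeline

# Repository id assumed from this card's title; replace if the model is hosted elsewhere.
classifier = pipeline("image-classification", model="spolivin/alz-mri-vit")

# Placeholder path to a brain MRI slice image.
predictions = classifier("mri_slice.png")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```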

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
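
The card only names the dataset; a minimal loading sketch, assuming the public `Falah/Alzheimer_MRI` dataset on the Hugging Face Hub, would be:

```python
from datasets import load_dataset

# Dataset id taken from the card metadata; the split layout is whatever the Hub repo provides.
dataset = load_dataset("Falah/Alzheimer_MRI")
print(dataset)
```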

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):

- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
- mixed_precision_training: Native AMP
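
For reference, the values above map onto a `TrainingArguments` configuration roughly as follows. This is an illustrative sketch, not the exact training script: `output_dir`, the evaluation/save strategies, and the best-model setting are assumptions rather than values stated in the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="alz-mri-vit",          # placeholder output directory
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,     # 16 * 4 = effective train batch size of 64
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
    fp16=True,                         # native AMP mixed precision
    evaluation_strategy="epoch",       # assumed from the per-epoch validation table below
    save_strategy="epoch",             # assumption
    metric_for_best_model="f1",        # assumption; F1 is the metric reported in the card
)
# Adam betas and epsilon listed above are the Trainer defaults, so they need no explicit setting.
```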

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.1218        | 1.0   | 64   | 0.9419          | 0.5742 |
| 0.94          | 2.0   | 128  | 0.9054          | 0.6029 |
| 0.9123        | 3.0   | 192  | 0.9019          | 0.5262 |
| 0.8625        | 4.0   | 256  | 0.8465          | 0.6029 |
| 0.8104        | 5.0   | 320  | 0.7810          | 0.6319 |
| 0.7244        | 6.0   | 384  | 0.7278          | 0.7037 |
| 0.697         | 7.0   | 448  | 0.6300          | 0.7480 |
| 0.5865        | 8.0   | 512  | 0.5659          | 0.7662 |
| 0.5199        | 9.0   | 576  | 0.5445          | 0.7721 |
| 0.4734        | 10.0  | 640  | 0.6750          | 0.7185 |
| 0.4399        | 11.0  | 704  | 0.4893          | 0.8274 |
| 0.3817        | 12.0  | 768  | 0.5578          | 0.7844 |
| 0.3318        | 13.0  | 832  | 0.4699          | 0.8228 |
| 0.3096        | 14.0  | 896  | 0.4460          | 0.8399 |
| 0.2787        | 15.0  | 960  | 0.4105          | 0.8399 |
| 0.2517        | 16.0  | 1024 | 0.3488          | 0.8578 |
| 0.2346        | 17.0  | 1088 | 0.3877          | 0.8773 |
| 0.2286        | 18.0  | 1152 | 0.3420          | 0.8575 |
| 0.1914        | 19.0  | 1216 | 0.4123          | 0.8682 |
| 0.1844        | 20.0  | 1280 | 0.2894          | 0.8913 |
| 0.173         | 21.0  | 1344 | 0.3197          | 0.8887 |
| 0.1687        | 22.0  | 1408 | 0.2626          | 0.9075 |
| 0.1601        | 23.0  | 1472 | 0.2951          | 0.9068 |
| 0.1466        | 24.0  | 1536 | 0.2666          | 0.9049 |
| 0.1468        | 25.0  | 1600 | 0.2136          | 0.9103 |
| 0.1226        | 26.0  | 1664 | 0.2387          | 0.9127 |
| 0.1186        | 27.0  | 1728 | 0.2131          | 0.9271 |
| 0.0951        | 28.0  | 1792 | 0.2520          | 0.9130 |
| 0.1049        | 29.0  | 1856 | 0.2096          | 0.9259 |
| 0.0936        | 30.0  | 1920 | 0.1875          | 0.9309 |
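
The F1 column above is typically produced by a `compute_metrics` callback passed to the `Trainer`, along the lines of the sketch below. The card does not state the averaging strategy, so `"weighted"` here is an assumption.

```python
import numpy as np
import evaluate

f1_metric = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Averaging strategy is an assumption; the card only reports "f1".
    return f1_metric.compute(predictions=predictions, references=labels, average="weighted")
```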

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0