---
base_model: openai/clip-vit-base-patch32
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: document-crop
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: validation
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.9021739130434783
---

# document-crop

This model is a fine-tuned version of [openai/clip-vit-base-patch32](https://huggingface.co/openai/clip-vit-base-patch32) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.7682
- Accuracy: 0.9022
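
As a quick usage sketch, the checkpoint can be loaded with the 🤗 Transformers image-classification pipeline. The repository id `habibi26/document-crop` and the example image path are assumptions for illustration; substitute the actual checkpoint path and your own input image.

```python
from transformers import pipeline

# Assumed repo id; a local path to the saved checkpoint directory works too.
classifier = pipeline("image-classification", model="habibi26/document-crop")

# Placeholder image path; returns a list of {"label": ..., "score": ...} dicts.
predictions = classifier("example_document.jpg")
print(predictions)
```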

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
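
The dataset is not documented beyond its `imagefolder` type. Purely as an illustration of that format, the sketch below shows how a folder-of-images dataset is typically loaded with 🤗 Datasets; the path and directory layout are assumptions, not a description of the data actually used here.

```python
from datasets import load_dataset

# Images arranged in class sub-directories (optionally under split folders);
# "path/to/document_images" is a placeholder.
dataset = load_dataset("imagefolder", data_dir="path/to/document_images")

# Class names are inferred from the sub-directory names.
print(dataset["train"].features["label"].names)
```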

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the illustrative sketch after the list):

- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 250
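
For reproducibility, here is a minimal sketch of how these hyperparameters map onto 🤗 Transformers `TrainingArguments`. Only the values are taken from the list above; the output directory, model, datasets, and the assumption of a single training device (16 × 4 = 64 effective batch size) are illustrative placeholders.

```python
from transformers import TrainingArguments

# Values copied from the hyperparameter list above; everything else is a placeholder.
training_args = TrainingArguments(
    output_dir="document-crop",          # placeholder output directory
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,       # 16 * 4 = 64 total train batch size (single device assumed)
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=250,
    # The default optimizer (AdamW with betas=(0.9, 0.999), epsilon=1e-08)
    # matches the optimizer settings reported above.
)

# A Trainer would then be built with the fine-tuned CLIP classification model
# and the imagefolder train/validation splits, e.g.:
# Trainer(model=model, args=training_args,
#         train_dataset=train_ds, eval_dataset=eval_ds).train()
```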

### Training results

| Training Loss | Epoch    | Step | Validation Loss | Accuracy |
|:-------------:|:--------:|:----:|:---------------:|:--------:|
| 0.6337        | 2.6667   | 20   | 0.6879          | 0.5870   |
| 0.4336        | 5.3333   | 40   | 0.7280          | 0.6413   |
| 0.2493        | 8.0      | 60   | 0.5044          | 0.75     |
| 0.1756        | 10.6667  | 80   | 0.3750          | 0.8478   |
| 0.1715        | 13.3333  | 100  | 0.7468          | 0.6957   |
| 0.1525        | 16.0     | 120  | 0.6240          | 0.7935   |
| 0.2019        | 18.6667  | 140  | 0.3115          | 0.8804   |
| 0.1366        | 21.3333  | 160  | 0.8020          | 0.7391   |
| 0.1729        | 24.0     | 180  | 0.7651          | 0.7283   |
| 0.1499        | 26.6667  | 200  | 0.6695          | 0.7826   |
| 0.1226        | 29.3333  | 220  | 0.5607          | 0.8370   |
| 0.1426        | 32.0     | 240  | 0.5363          | 0.8152   |
| 0.0986        | 34.6667  | 260  | 0.2214          | 0.9022   |
| 0.0984        | 37.3333  | 280  | 0.2494          | 0.9022   |
| 0.1764        | 40.0     | 300  | 0.3202          | 0.9022   |
| 0.0712        | 42.6667  | 320  | 0.6895          | 0.8370   |
| 0.104         | 45.3333  | 340  | 0.8008          | 0.75     |
| 0.107         | 48.0     | 360  | 0.6523          | 0.8696   |
| 0.1446        | 50.6667  | 380  | 0.4615          | 0.8370   |
| 0.0525        | 53.3333  | 400  | 0.5936          | 0.9130   |
| 0.1076        | 56.0     | 420  | 0.5063          | 0.9022   |
| 0.0554        | 58.6667  | 440  | 0.4740          | 0.8913   |
| 0.0701        | 61.3333  | 460  | 0.4842          | 0.8587   |
| 0.1011        | 64.0     | 480  | 0.5180          | 0.8587   |
| 0.0471        | 66.6667  | 500  | 1.6979          | 0.7717   |
| 0.0559        | 69.3333  | 520  | 0.4181          | 0.9022   |
| 0.0371        | 72.0     | 540  | 0.4239          | 0.9022   |
| 0.0653        | 74.6667  | 560  | 0.2725          | 0.9239   |
| 0.0564        | 77.3333  | 580  | 0.8607          | 0.8043   |
| 0.0427        | 80.0     | 600  | 0.2848          | 0.9457   |
| 0.1251        | 82.6667  | 620  | 0.3903          | 0.9022   |
| 0.023         | 85.3333  | 640  | 0.4514          | 0.9239   |
| 0.0297        | 88.0     | 660  | 0.7634          | 0.8913   |
| 0.0553        | 90.6667  | 680  | 0.5395          | 0.8913   |
| 0.0147        | 93.3333  | 700  | 0.7752          | 0.8696   |
| 0.0804        | 96.0     | 720  | 0.6780          | 0.8913   |
| 0.0154        | 98.6667  | 740  | 0.7887          | 0.8587   |
| 0.0063        | 101.3333 | 760  | 0.5492          | 0.9239   |
| 0.0131        | 104.0    | 780  | 0.8119          | 0.8804   |
| 0.0113        | 106.6667 | 800  | 1.0839          | 0.8587   |
| 0.0268        | 109.3333 | 820  | 1.0396          | 0.8587   |
| 0.0215        | 112.0    | 840  | 0.8707          | 0.9022   |
| 0.0271        | 114.6667 | 860  | 0.5733          | 0.9457   |
| 0.0208        | 117.3333 | 880  | 0.6780          | 0.9130   |
| 0.0224        | 120.0    | 900  | 0.3565          | 0.9457   |
| 0.0324        | 122.6667 | 920  | 0.3860          | 0.9239   |
| 0.019         | 125.3333 | 940  | 0.5652          | 0.9022   |
| 0.0079        | 128.0    | 960  | 0.5316          | 0.9348   |
| 0.0064        | 130.6667 | 980  | 0.5368          | 0.9239   |
| 0.0055        | 133.3333 | 1000 | 0.8009          | 0.8913   |
| 0.0156        | 136.0    | 1020 | 0.8391          | 0.9348   |
| 0.04          | 138.6667 | 1040 | 0.6336          | 0.9022   |
| 0.0031        | 141.3333 | 1060 | 0.5656          | 0.9348   |
| 0.0009        | 144.0    | 1080 | 0.4957          | 0.9348   |
| 0.0004        | 146.6667 | 1100 | 0.9136          | 0.8913   |
| 0.006         | 149.3333 | 1120 | 0.9782          | 0.8913   |
| 0.0004        | 152.0    | 1140 | 0.9065          | 0.9239   |
| 0.0042        | 154.6667 | 1160 | 0.9944          | 0.9130   |
| 0.0001        | 157.3333 | 1180 | 0.8723          | 0.9239   |
| 0.0002        | 160.0    | 1200 | 1.1987          | 0.8804   |
| 0.0083        | 162.6667 | 1220 | 0.7118          | 0.9239   |
| 0.0           | 165.3333 | 1240 | 0.7793          | 0.9130   |
| 0.0           | 168.0    | 1260 | 0.7330          | 0.9239   |
| 0.0038        | 170.6667 | 1280 | 0.5990          | 0.9348   |
| 0.0001        | 173.3333 | 1300 | 0.6496          | 0.9239   |
| 0.0           | 176.0    | 1320 | 0.8535          | 0.8913   |
| 0.0           | 178.6667 | 1340 | 0.6108          | 0.9348   |
| 0.0           | 181.3333 | 1360 | 0.5813          | 0.9348   |
| 0.0           | 184.0    | 1380 | 0.5817          | 0.9239   |
| 0.0           | 186.6667 | 1400 | 0.5852          | 0.9239   |
| 0.0           | 189.3333 | 1420 | 0.5877          | 0.9239   |
| 0.0           | 192.0    | 1440 | 0.5941          | 0.9239   |
| 0.0           | 194.6667 | 1460 | 0.6219          | 0.9130   |
| 0.0           | 197.3333 | 1480 | 0.6350          | 0.9130   |
| 0.0           | 200.0    | 1500 | 0.6388          | 0.9130   |
| 0.0           | 202.6667 | 1520 | 0.6409          | 0.9130   |
| 0.0           | 205.3333 | 1540 | 0.6423          | 0.9130   |
| 0.0           | 208.0    | 1560 | 0.6430          | 0.9130   |
| 0.0           | 210.6667 | 1580 | 0.6336          | 0.9130   |
| 0.0           | 213.3333 | 1600 | 0.7124          | 0.9022   |
| 0.0           | 216.0    | 1620 | 0.7457          | 0.9022   |
| 0.0           | 218.6667 | 1640 | 0.7498          | 0.9022   |
| 0.0           | 221.3333 | 1660 | 0.7505          | 0.9022   |
| 0.0           | 224.0    | 1680 | 0.7512          | 0.9022   |
| 0.0           | 226.6667 | 1700 | 0.7660          | 0.9022   |
| 0.0           | 229.3333 | 1720 | 0.7682          | 0.9022   |
| 0.0           | 232.0    | 1740 | 0.7682          | 0.9022   |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1