
final-algae-swin-wirs

This model is a fine-tuned version of samitizerxu/final-algae-swin-wirs on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7645
  • Accuracy: 0.6725
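
For reference, a minimal inference sketch using the 🤗 Transformers image-classification pipeline; the checkpoint name is taken from this card, and the image path is a placeholder:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint named in this card.
classifier = pipeline(
    "image-classification",
    model="samitizerxu/final-algae-swin-wirs",
)

# "algae.jpg" is a placeholder path; substitute your own image.
predictions = classifier("algae.jpg")
for pred in predictions:
    print(f"{pred['label']}: {pred['score']:.4f}")
```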

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
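
As a rough sketch, the hyperparameters above map onto the standard 🤗 Transformers TrainingArguments as follows; the output directory is a placeholder, and the Adam betas/epsilon listed above are the Trainer defaults, so they are not set explicitly:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./final-algae-swin-wirs",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 32 * 4 = 128
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```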

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.3024 | 1.0 | 120 | 0.7645 | 0.6725 |
| 1.2375 | 2.0 | 240 | 0.8013 | 0.6673 |
| 1.1875 | 3.0 | 360 | 0.8251 | 0.6649 |
| 1.1870 | 4.0 | 480 | 0.8960 | 0.6403 |
| 1.1472 | 5.0 | 600 | 0.9558 | 0.6244 |
| 1.1374 | 6.0 | 720 | 1.1951 | 0.4877 |
| 1.1114 | 7.0 | 840 | 1.1109 | 0.5358 |
| 1.1201 | 8.0 | 960 | 0.9724 | 0.6227 |
| 1.0801 | 9.0 | 1080 | 0.9913 | 0.5863 |
| 1.0995 | 10.0 | 1200 | 1.0117 | 0.5933 |
| 1.0817 | 11.0 | 1320 | 1.0239 | 0.5951 |
| 1.0679 | 12.0 | 1440 | 1.0381 | 0.5839 |
| 1.0940 | 13.0 | 1560 | 1.0480 | 0.5910 |
| 1.0325 | 14.0 | 1680 | 1.0671 | 0.5839 |
| 1.0087 | 15.0 | 1800 | 1.0133 | 0.5892 |
| 1.0525 | 16.0 | 1920 | 1.0332 | 0.5775 |
| 1.0614 | 17.0 | 2040 | 1.0085 | 0.5939 |
| 1.0065 | 18.0 | 2160 | 1.0070 | 0.5974 |
| 1.0474 | 19.0 | 2280 | 1.0023 | 0.5898 |
| 1.0346 | 20.0 | 2400 | 1.0072 | 0.5839 |
| 1.0226 | 21.0 | 2520 | 1.0219 | 0.5792 |
| 1.0474 | 22.0 | 2640 | 1.0106 | 0.5880 |
| 0.9830 | 23.0 | 2760 | 1.0020 | 0.5874 |
| 0.9997 | 24.0 | 2880 | 1.0838 | 0.5593 |
| 1.0074 | 25.0 | 3000 | 1.0781 | 0.5593 |
| 1.0000 | 26.0 | 3120 | 1.0378 | 0.5751 |
| 1.0279 | 27.0 | 3240 | 1.0737 | 0.5604 |
| 0.9696 | 28.0 | 3360 | 1.1385 | 0.5123 |
| 0.9862 | 29.0 | 3480 | 1.1236 | 0.5282 |
| 1.0155 | 30.0 | 3600 | 1.0415 | 0.5798 |
| 0.9723 | 31.0 | 3720 | 1.1447 | 0.5258 |
| 0.9935 | 32.0 | 3840 | 1.1166 | 0.5323 |
| 0.9965 | 33.0 | 3960 | 1.0502 | 0.5716 |
| 0.9645 | 34.0 | 4080 | 1.1316 | 0.5329 |
| 0.9771 | 35.0 | 4200 | 1.1860 | 0.5170 |
| 0.9976 | 36.0 | 4320 | 1.2937 | 0.4906 |
| 0.9207 | 37.0 | 4440 | 1.2272 | 0.5135 |
| 0.9813 | 38.0 | 4560 | 1.2067 | 0.5258 |
| 0.9337 | 39.0 | 4680 | 1.2162 | 0.5282 |
| 0.9628 | 40.0 | 4800 | 1.2700 | 0.5059 |
| 0.9561 | 41.0 | 4920 | 1.2428 | 0.5094 |
| 0.9208 | 42.0 | 5040 | 1.2271 | 0.5158 |
| 0.9097 | 43.0 | 5160 | 1.2388 | 0.5182 |
| 0.9487 | 44.0 | 5280 | 1.1966 | 0.5264 |
| 0.9386 | 45.0 | 5400 | 1.2107 | 0.5258 |
| 0.9291 | 46.0 | 5520 | 1.2893 | 0.4977 |
| 0.9357 | 47.0 | 5640 | 1.2764 | 0.5041 |
| 0.9064 | 48.0 | 5760 | 1.2710 | 0.5012 |
| 0.9032 | 49.0 | 5880 | 1.2695 | 0.5000 |
| 0.9423 | 50.0 | 6000 | 1.2703 | 0.4982 |
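
The evaluation results reported at the top of this card match the epoch-1 row, which has the lowest validation loss and highest accuracy in the table; validation loss generally rises and accuracy falls over later epochs, suggesting the model overfits well before 50 epochs.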

Framework versions

  • Transformers 4.26.1
  • PyTorch 1.13.1+cu117
  • Datasets 2.9.0
  • Tokenizers 0.13.2