output_AAVESM2_650M_v1

This model is a fine-tuned version of facebook/esm2_t33_650M_UR50D on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3707
  • Accuracy: 0.8905
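
Since usage details are not documented, here is a minimal inference sketch. It assumes the checkpoint keeps ESM-2's masked-language-modeling head (the Loss/Accuracy pair above is what Transformers' MLM fine-tuning scripts report); if this is actually a classification fine-tune, swap in AutoModelForSequenceClassification. The repo path and the toy sequence are placeholders.

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Placeholder path: use this repo's hub id or a local checkpoint directory.
model_name = "output_AAVESM2_650M_v1"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)  # assumes an MLM head
model.eval()

# Score a masked position in a toy protein sequence (hypothetical example).
sequence = "MKTAYIAKQR<mask>DLGLDI"
inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask token and report the five most likely residues there.
mask_idx = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
top = logits[0, mask_idx].topk(5)
print(tokenizer.convert_ids_to_tokens(top.indices.tolist()))
```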

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 2
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 8
  • total_eval_batch_size: 2
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 36.0
  • mixed_precision_training: Native AMP
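
As a reproducibility aid, the list above maps onto Transformers' TrainingArguments roughly as sketched below. Only the hyperparameter values come from this card; the output path and the per-epoch evaluation/logging strategy (implied by the results table) are assumptions, and the two-GPU distributed setup comes from the launcher (e.g. torchrun), not from these arguments.

```python
from transformers import TrainingArguments

# With 2 GPUs, 1 sample per device, and 4 accumulation steps, the effective
# train batch size is 1 * 2 * 4 = 8, matching total_train_batch_size above.
training_args = TrainingArguments(
    output_dir="output_AAVESM2_650M_v1",  # assumed output path
    learning_rate=1e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=4,
    num_train_epochs=36.0,
    lr_scheduler_type="linear",
    fp16=True,  # "Native AMP" mixed precision
    evaluation_strategy="epoch",  # assumed: the table logs one eval per epoch
    logging_strategy="epoch",
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default,
# so no explicit optimizer flags are needed.
```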

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| No log        | 1.0   | 390   | 1.3347          | 0.5993   |
| 1.5408        | 2.0   | 780   | 1.0699          | 0.6796   |
| 1.1283        | 3.0   | 1170  | 0.8751          | 0.7373   |
| 0.9078        | 4.0   | 1560  | 0.7534          | 0.7711   |
| 0.9078        | 5.0   | 1950  | 0.6711          | 0.8022   |
| 0.7705        | 6.0   | 2340  | 0.6078          | 0.8169   |
| 0.6863        | 7.0   | 2730  | 0.5668          | 0.8318   |
| 0.6277        | 8.0   | 3120  | 0.5461          | 0.8386   |
| 0.5863        | 9.0   | 3510  | 0.5143          | 0.8514   |
| 0.5863        | 10.0  | 3900  | 0.4992          | 0.8522   |
| 0.5564        | 11.0  | 4290  | 0.4940          | 0.8533   |
| 0.5199        | 12.0  | 4680  | 0.4727          | 0.8633   |
| 0.5025        | 13.0  | 5070  | 0.4586          | 0.8638   |
| 0.5025        | 14.0  | 5460  | 0.4549          | 0.8673   |
| 0.4814        | 15.0  | 5850  | 0.4442          | 0.8698   |
| 0.4746        | 16.0  | 6240  | 0.4306          | 0.8750   |
| 0.4527        | 17.0  | 6630  | 0.4291          | 0.8742   |
| 0.4382        | 18.0  | 7020  | 0.4213          | 0.8751   |
| 0.4382        | 19.0  | 7410  | 0.4193          | 0.8751   |
| 0.4328        | 20.0  | 7800  | 0.4143          | 0.8760   |
| 0.4191        | 21.0  | 8190  | 0.4071          | 0.8836   |
| 0.4106        | 22.0  | 8580  | 0.3980          | 0.8819   |
| 0.4106        | 23.0  | 8970  | 0.3987          | 0.8822   |
| 0.4037        | 24.0  | 9360  | 0.4027          | 0.8819   |
| 0.3893        | 25.0  | 9750  | 0.3868          | 0.8893   |
| 0.3991        | 26.0  | 10140 | 0.3882          | 0.8846   |
| 0.3786        | 27.0  | 10530 | 0.3939          | 0.8859   |
| 0.3786        | 28.0  | 10920 | 0.3959          | 0.8848   |
| 0.38          | 29.0  | 11310 | 0.3950          | 0.8850   |
| 0.3764        | 30.0  | 11700 | 0.3783          | 0.8893   |
| 0.3708        | 31.0  | 12090 | 0.3799          | 0.8891   |
| 0.3708        | 32.0  | 12480 | 0.3915          | 0.8867   |
| 0.3656        | 33.0  | 12870 | 0.3780          | 0.8903   |
| 0.3617        | 34.0  | 13260 | 0.3805          | 0.8874   |
| 0.361         | 35.0  | 13650 | 0.3776          | 0.8920   |
| 0.3595        | 36.0  | 14040 | 0.3712          | 0.8888   |

Framework versions

  • Transformers 4.26.1
  • Pytorch 1.13.1+cu117
  • Datasets 2.9.0
  • Tokenizers 0.13.2
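
To check that a local environment matches these pins before loading the model, a small hypothetical helper:

```python
import datasets
import tokenizers
import torch
import transformers

# Versions pinned by this card; a mismatch is usually fine but worth knowing.
pinned = {
    transformers: "4.26.1",
    torch: "1.13.1+cu117",
    datasets: "2.9.0",
    tokenizers: "0.13.2",
}
for module, version in pinned.items():
    status = "OK" if module.__version__ == version else "differs"
    print(f"{module.__name__}: installed {module.__version__}, pinned {version} ({status})")
```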