---
base_model: nferruz/ProtGPT2
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: 5e05_output_dir_clean_df_10-100_noX_100_50_epoch_cluster
    results: []
widget:
  - text: <|endoftext|>
---

# 5e05_output_dir_clean_df_10-100_noX_100_50_epoch_cluster

This model is a fine-tuned version of [nferruz/ProtGPT2](https://huggingface.co/nferruz/ProtGPT2) on an unknown dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

- Loss: 3.4181
- Accuracy: 0.5481
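
Since usage is not documented in this card, here is a minimal sketch of sampling sequences with the 🤗 Transformers `text-generation` pipeline, following the upstream ProtGPT2 usage pattern. The repository id and all sampling parameters below are assumptions, not values from this card; only the `<|endoftext|>` prompt comes from the widget metadata.

```python
# Minimal sketch: sample sequences from the fine-tuned checkpoint.
# The repo id and sampling parameters are illustrative assumptions.
from transformers import pipeline

generator = pipeline("text-generation", model="wabu/5e-05_AmpGPT2")

# ProtGPT2-style prompting: generation starts from the <|endoftext|> token,
# the same prompt configured in this card's widget metadata.
outputs = generator(
    "<|endoftext|>",
    max_length=100,          # assumed cap on generated sequence length
    do_sample=True,          # sampling rather than greedy decoding
    top_k=950,               # value from the upstream ProtGPT2 example
    repetition_penalty=1.2,  # discourage repeated residues
    num_return_sequences=5,
    eos_token_id=0,          # <|endoftext|> is token 0 in the ProtGPT2 vocab
)
for out in outputs:
    print(out["generated_text"])
```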

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows this list):

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50.0
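
As referenced above, a minimal sketch of how these hyperparameters map onto 🤗 Transformers `TrainingArguments`. The output directory is taken from the model name; dataset and `Trainer` wiring are omitted, and the Adam betas/epsilon listed are the library defaults, so they need no explicit flags.

```python
# Sketch only: maps the listed hyperparameters onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="5e05_output_dir_clean_df_10-100_noX_100_50_epoch_cluster",
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",   # linear decay of the learning rate
    num_train_epochs=50.0,
    # betas=(0.9, 0.999) and epsilon=1e-08 are the defaults for
    # adam_beta1 / adam_beta2 / adam_epsilon, so they are not set here.
)
```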

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 148  | 5.1495          | 0.2912   |
| No log        | 2.0   | 296  | 4.6761          | 0.3383   |
| No log        | 3.0   | 444  | 4.3827          | 0.3712   |
| 4.9816        | 4.0   | 592  | 4.1762          | 0.3960   |
| 4.9816        | 5.0   | 740  | 4.0295          | 0.4136   |
| 4.9816        | 6.0   | 888  | 3.9151          | 0.4276   |
| 3.9068        | 7.0   | 1036 | 3.8298          | 0.4400   |
| 3.9068        | 8.0   | 1184 | 3.7551          | 0.4500   |
| 3.9068        | 9.0   | 1332 | 3.6968          | 0.4586   |
| 3.9068        | 10.0  | 1480 | 3.6389          | 0.4668   |
| 3.3777        | 11.0  | 1628 | 3.5960          | 0.4743   |
| 3.3777        | 12.0  | 1776 | 3.5560          | 0.4803   |
| 3.3777        | 13.0  | 1924 | 3.5380          | 0.4868   |
| 3.0075        | 14.0  | 2072 | 3.4846          | 0.4914   |
| 3.0075        | 15.0  | 2220 | 3.4658          | 0.4969   |
| 3.0075        | 16.0  | 2368 | 3.4555          | 0.5010   |
| 2.7329        | 17.0  | 2516 | 3.4300          | 0.5053   |
| 2.7329        | 18.0  | 2664 | 3.4208          | 0.5080   |
| 2.7329        | 19.0  | 2812 | 3.4250          | 0.5121   |
| 2.7329        | 20.0  | 2960 | 3.3964          | 0.5147   |
| 2.5153        | 21.0  | 3108 | 3.3893          | 0.5181   |
| 2.5153        | 22.0  | 3256 | 3.3914          | 0.5204   |
| 2.5153        | 23.0  | 3404 | 3.3819          | 0.5229   |
| 2.336         | 24.0  | 3552 | 3.3786          | 0.5247   |
| 2.336         | 25.0  | 3700 | 3.3749          | 0.5267   |
| 2.336         | 26.0  | 3848 | 3.3774          | 0.5287   |
| 2.336         | 27.0  | 3996 | 3.3700          | 0.5303   |
| 2.1918        | 28.0  | 4144 | 3.3722          | 0.5321   |
| 2.1918        | 29.0  | 4292 | 3.3729          | 0.5340   |
| 2.1918        | 30.0  | 4440 | 3.3896          | 0.5350   |
| 2.0717        | 31.0  | 4588 | 3.3776          | 0.5367   |
| 2.0717        | 32.0  | 4736 | 3.3842          | 0.5385   |
| 2.0717        | 33.0  | 4884 | 3.3820          | 0.5399   |
| 1.9814        | 34.0  | 5032 | 3.3933          | 0.5404   |
| 1.9814        | 35.0  | 5180 | 3.3861          | 0.5411   |
| 1.9814        | 36.0  | 5328 | 3.3878          | 0.5425   |
| 1.9814        | 37.0  | 5476 | 3.3903          | 0.5431   |
| 1.9049        | 38.0  | 5624 | 3.3848          | 0.5440   |
| 1.9049        | 39.0  | 5772 | 3.3965          | 0.5447   |
| 1.9049        | 40.0  | 5920 | 3.4033          | 0.5453   |
| 1.8441        | 41.0  | 6068 | 3.4074          | 0.5456   |
| 1.8441        | 42.0  | 6216 | 3.4046          | 0.5462   |
| 1.8441        | 43.0  | 6364 | 3.4120          | 0.5467   |
| 1.804         | 44.0  | 6512 | 3.4044          | 0.5467   |
| 1.804         | 45.0  | 6660 | 3.4125          | 0.5472   |
| 1.804         | 46.0  | 6808 | 3.4115          | 0.5477   |
| 1.804         | 47.0  | 6956 | 3.4070          | 0.5477   |
| 1.7744        | 48.0  | 7104 | 3.4203          | 0.5478   |
| 1.7744        | 49.0  | 7252 | 3.4174          | 0.5479   |
| 1.7744        | 50.0  | 7400 | 3.4181          | 0.5481   |
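
Not stated in the card, but worth noting: for a causal language model, perplexity is the exponential of the cross-entropy loss, so the final validation loss of 3.4181 corresponds to a perplexity of roughly 30.5. A one-line check:

```python
import math

# Perplexity of a causal LM is exp(cross-entropy loss).
print(math.exp(3.4181))  # ≈ 30.51
```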

### Framework versions

- Transformers 4.38.0.dev0
- Pytorch 2.2.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0