
GUE_EMP_H3K9ac-seqsight_16384_512_56M-L32_f

This model is a fine-tuned version of mahdibaghbanzadeh/seqsight_16384_512_56M on the mahdibaghbanzadeh/GUE_EMP_H3K9ac dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5010
  • F1 Score: 0.7846
  • Accuracy: 0.7841
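The card does not include a usage example; the following is a minimal, hedged sketch of how the adapter could be loaded for inference. The adapter repo id, num_labels=2, and trust_remote_code=True are assumptions, not confirmed by this card.

```python
# Hedged usage sketch: load the PEFT adapter on top of the base model for
# sequence classification. Assumptions (not confirmed by this card): the
# adapter repo id below, a binary classification head (num_labels=2), and
# that the base model requires trust_remote_code=True.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from peft import PeftModel

base_id = "mahdibaghbanzadeh/seqsight_16384_512_56M"
adapter_id = "mahdibaghbanzadeh/GUE_EMP_H3K9ac-seqsight_16384_512_56M-L32_f"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(base_id, trust_remote_code=True)
base_model = AutoModelForSequenceClassification.from_pretrained(
    base_id, num_labels=2, trust_remote_code=True
)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

sequence = "ACGTACGTACGTACGTACGTACGTACGTACGT"  # placeholder DNA sequence
inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(probs)
```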

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 0.0005
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • training_steps: 10000
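As a rough illustration only, the values above map onto Transformers TrainingArguments as sketched below; output_dir and the evaluation cadence are assumptions, not taken from this card.

```python
# Illustrative sketch: the hyperparameters listed in this card expressed as
# TrainingArguments. output_dir and the evaluation cadence are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="GUE_EMP_H3K9ac-seqsight_16384_512_56M-L32_f",  # assumed
    learning_rate=5e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    max_steps=10_000,
    evaluation_strategy="steps",  # assumed; the results table reports metrics every 200 steps
    eval_steps=200,               # assumed from the results table
)
```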

Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Score | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|
| 0.5565 | 1.15 | 200 | 0.5465 | 0.7333 | 0.7334 |
| 0.5011 | 2.3 | 400 | 0.5553 | 0.6999 | 0.7060 |
| 0.4765 | 3.45 | 600 | 0.5192 | 0.7452 | 0.7460 |
| 0.4689 | 4.6 | 800 | 0.5017 | 0.7538 | 0.7542 |
| 0.4619 | 5.75 | 1000 | 0.5046 | 0.7607 | 0.7611 |
| 0.4479 | 6.9 | 1200 | 0.4935 | 0.7728 | 0.7726 |
| 0.4407 | 8.05 | 1400 | 0.4994 | 0.7679 | 0.7675 |
| 0.4289 | 9.2 | 1600 | 0.5391 | 0.7429 | 0.7449 |
| 0.4197 | 10.34 | 1800 | 0.5165 | 0.7561 | 0.7567 |
| 0.413 | 11.49 | 2000 | 0.4956 | 0.7697 | 0.7693 |
| 0.4003 | 12.64 | 2200 | 0.4967 | 0.7658 | 0.7661 |
| 0.3972 | 13.79 | 2400 | 0.5274 | 0.7491 | 0.7510 |
| 0.3863 | 14.94 | 2600 | 0.4881 | 0.7713 | 0.7708 |
| 0.3783 | 16.09 | 2800 | 0.5760 | 0.7378 | 0.7413 |
| 0.3673 | 17.24 | 3000 | 0.5253 | 0.7624 | 0.7629 |
| 0.3608 | 18.39 | 3200 | 0.5385 | 0.7592 | 0.7593 |
| 0.3588 | 19.54 | 3400 | 0.5170 | 0.7635 | 0.7632 |
| 0.3431 | 20.69 | 3600 | 0.5149 | 0.7730 | 0.7726 |
| 0.3393 | 21.84 | 3800 | 0.5352 | 0.7704 | 0.7701 |
| 0.3278 | 22.99 | 4000 | 0.5680 | 0.7584 | 0.7589 |
| 0.3275 | 24.14 | 4200 | 0.5353 | 0.7673 | 0.7668 |
| 0.3126 | 25.29 | 4400 | 0.5789 | 0.7625 | 0.7625 |
| 0.3121 | 26.44 | 4600 | 0.5664 | 0.7674 | 0.7672 |
| 0.302 | 27.59 | 4800 | 0.5861 | 0.7533 | 0.7539 |
| 0.2934 | 28.74 | 5000 | 0.5784 | 0.7569 | 0.7567 |
| 0.2937 | 29.89 | 5200 | 0.5977 | 0.7534 | 0.7531 |
| 0.2812 | 31.03 | 5400 | 0.5971 | 0.7575 | 0.7575 |
| 0.2787 | 32.18 | 5600 | 0.6287 | 0.7487 | 0.7492 |
| 0.2675 | 33.33 | 5800 | 0.6269 | 0.7643 | 0.7639 |
| 0.2674 | 34.48 | 6000 | 0.6238 | 0.7590 | 0.7585 |
| 0.2552 | 35.63 | 6200 | 0.6466 | 0.7610 | 0.7611 |
| 0.2587 | 36.78 | 6400 | 0.6403 | 0.7590 | 0.7589 |
| 0.2477 | 37.93 | 6600 | 0.6421 | 0.7539 | 0.7542 |
| 0.2405 | 39.08 | 6800 | 0.6798 | 0.7376 | 0.7380 |
| 0.2391 | 40.23 | 7000 | 0.6509 | 0.7511 | 0.7513 |
| 0.2355 | 41.38 | 7200 | 0.6706 | 0.7572 | 0.7571 |
| 0.2281 | 42.53 | 7400 | 0.7032 | 0.7441 | 0.7449 |
| 0.2321 | 43.68 | 7600 | 0.6918 | 0.7460 | 0.7463 |
| 0.2237 | 44.83 | 7800 | 0.7034 | 0.7502 | 0.7499 |
| 0.2214 | 45.98 | 8000 | 0.6958 | 0.7582 | 0.7578 |
| 0.2179 | 47.13 | 8200 | 0.7049 | 0.7534 | 0.7531 |
| 0.2125 | 48.28 | 8400 | 0.7326 | 0.7488 | 0.7488 |
| 0.2101 | 49.43 | 8600 | 0.7270 | 0.7541 | 0.7539 |
| 0.2086 | 50.57 | 8800 | 0.7434 | 0.7493 | 0.7492 |
| 0.2076 | 51.72 | 9000 | 0.7319 | 0.7508 | 0.7506 |
| 0.2024 | 52.87 | 9200 | 0.7368 | 0.7509 | 0.7506 |
| 0.2052 | 54.02 | 9400 | 0.7500 | 0.7498 | 0.7496 |
| 0.2042 | 55.17 | 9600 | 0.7443 | 0.7500 | 0.7499 |
| 0.2046 | 56.32 | 9800 | 0.7369 | 0.7530 | 0.7528 |
| 0.2003 | 57.47 | 10000 | 0.7377 | 0.7545 | 0.7542 |
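The F1 and accuracy columns above come from periodic evaluation. A minimal sketch of a compute_metrics function that would produce such values is shown below; macro-averaged F1 is an assumption, since the card does not state how F1 was computed.

```python
# Hedged sketch of a metrics function (for a Trainer) producing the F1 score
# and accuracy reported above. Macro-averaged F1 is an assumption.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "f1_score": f1_score(labels, preds, average="macro"),
        "accuracy": accuracy_score(labels, preds),
    }
```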

Framework versions

  • PEFT 0.9.0
  • Transformers 4.38.2
  • Pytorch 2.2.0+cu121
  • Datasets 2.17.1
  • Tokenizers 0.15.2