# GUE_EMP_H3K4me3-seqsight_4096_512_15M-L32

This model is a fine-tuned version of `mahdibaghbanzadeh/seqsight_4096_512_15M` on the `mahdibaghbanzadeh/GUE_EMP_H3K4me3` dataset. It achieves the following results on the evaluation set:
- Loss: 0.6806
- F1 Score: 0.5702
- Accuracy: 0.5723
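For reference, the two metrics above can be computed from model predictions with standard definitions. The sketch below assumes a binary classification task (H3K4me3 mark present or absent) and macro-averaged F1; neither detail is confirmed by this card, and the label/prediction arrays are hypothetical.

```python
# Sketch of the metric definitions behind "F1 Score" and "Accuracy" above.
# Binary labels and macro averaging are assumptions, not stated by the card.

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def macro_f1(y_true, y_pred, labels=(0, 1)):
    """Unweighted mean of the per-class F1 scores."""
    f1s = []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return sum(f1s) / len(f1s)

# Hypothetical predictions on 8 sequences:
y_true = [0, 1, 1, 0, 1, 0, 0, 1]
y_pred = [0, 1, 0, 0, 1, 1, 0, 1]
print(accuracy(y_true, y_pred))  # 0.75
print(macro_f1(y_true, y_pred))  # 0.75
```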
## Model description

More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 20000
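The `linear` scheduler above decays the learning rate linearly from its initial value to zero over the full step budget. The sketch below shows that schedule in isolation; the warmup count of 0 is an assumption, since the card does not report one.

```python
# Sketch of the linear LR schedule implied by the hyperparameters above:
# LR ramps up over `warmup` steps (assumed 0 here), then decays linearly
# from learning_rate (0.0005) to 0 over training_steps (20000).

LEARNING_RATE = 5e-4
TRAINING_STEPS = 20_000

def linear_lr(step, base_lr=LEARNING_RATE, total=TRAINING_STEPS, warmup=0):
    if step < warmup:
        return base_lr * step / max(1, warmup)
    return base_lr * max(0.0, (total - step) / max(1, total - warmup))

print(linear_lr(0))       # 0.0005 at the start
print(linear_lr(10_000))  # 0.00025 halfway through
print(linear_lr(20_000))  # 0.0 at the end
```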
### Training results
Training Loss | Epoch | Step | Validation Loss | F1 Score | Accuracy |
---|---|---|---|---|---|
0.6905 | 1.74 | 400 | 0.6839 | 0.5471 | 0.5649 |
0.6747 | 3.48 | 800 | 0.6820 | 0.5514 | 0.5715 |
0.6679 | 5.22 | 1200 | 0.6863 | 0.5658 | 0.5663 |
0.6629 | 6.96 | 1600 | 0.6866 | 0.5682 | 0.5698 |
0.6542 | 8.7 | 2000 | 0.6890 | 0.5602 | 0.5630 |
0.6488 | 10.43 | 2400 | 0.6938 | 0.5551 | 0.5620 |
0.6442 | 12.17 | 2800 | 0.6980 | 0.5515 | 0.5541 |
0.6397 | 13.91 | 3200 | 0.7055 | 0.5401 | 0.5451 |
0.6319 | 15.65 | 3600 | 0.7078 | 0.5516 | 0.5530 |
0.6263 | 17.39 | 4000 | 0.7603 | 0.5457 | 0.5454 |
0.6222 | 19.13 | 4400 | 0.7448 | 0.5388 | 0.5476 |
0.6175 | 20.87 | 4800 | 0.7215 | 0.5428 | 0.5429 |
0.6128 | 22.61 | 5200 | 0.7416 | 0.5464 | 0.5467 |
0.6108 | 24.35 | 5600 | 0.7456 | 0.5414 | 0.5451 |
0.6093 | 26.09 | 6000 | 0.7661 | 0.5462 | 0.5484 |
0.6044 | 27.83 | 6400 | 0.7545 | 0.5457 | 0.5470 |
0.6005 | 29.57 | 6800 | 0.7592 | 0.5428 | 0.5427 |
0.5986 | 31.3 | 7200 | 0.7531 | 0.5426 | 0.5486 |
0.5939 | 33.04 | 7600 | 0.7642 | 0.5417 | 0.5427 |
0.5884 | 34.78 | 8000 | 0.7383 | 0.5352 | 0.5389 |
0.5859 | 36.52 | 8400 | 0.7679 | 0.5379 | 0.5378 |
0.5856 | 38.26 | 8800 | 0.7596 | 0.5427 | 0.5451 |
0.583 | 40.0 | 9200 | 0.7603 | 0.5509 | 0.5519 |
0.5794 | 41.74 | 9600 | 0.7692 | 0.5431 | 0.5467 |
0.5778 | 43.48 | 10000 | 0.7632 | 0.5458 | 0.5495 |
0.5738 | 45.22 | 10400 | 0.7483 | 0.5376 | 0.5448 |
0.5718 | 46.96 | 10800 | 0.7720 | 0.5438 | 0.5435 |
0.5669 | 48.7 | 11200 | 0.7775 | 0.5473 | 0.5473 |
0.5664 | 50.43 | 11600 | 0.7911 | 0.5441 | 0.5486 |
0.5625 | 52.17 | 12000 | 0.7856 | 0.5484 | 0.5516 |
0.5618 | 53.91 | 12400 | 0.7766 | 0.5435 | 0.5486 |
0.5585 | 55.65 | 12800 | 0.7843 | 0.5505 | 0.5524 |
0.5564 | 57.39 | 13200 | 0.7953 | 0.5410 | 0.5451 |
0.5565 | 59.13 | 13600 | 0.7947 | 0.5421 | 0.5446 |
0.5516 | 60.87 | 14000 | 0.7932 | 0.5439 | 0.5465 |
0.5492 | 62.61 | 14400 | 0.7903 | 0.5445 | 0.5470 |
0.5459 | 64.35 | 14800 | 0.7897 | 0.5481 | 0.5495 |
0.5443 | 66.09 | 15200 | 0.8035 | 0.5400 | 0.5440 |
0.5442 | 67.83 | 15600 | 0.8019 | 0.5412 | 0.5489 |
0.5393 | 69.57 | 16000 | 0.8215 | 0.5422 | 0.5424 |
0.5379 | 71.3 | 16400 | 0.8150 | 0.5420 | 0.5437 |
0.5395 | 73.04 | 16800 | 0.8073 | 0.5428 | 0.5462 |
0.537 | 74.78 | 17200 | 0.7981 | 0.5461 | 0.5465 |
0.536 | 76.52 | 17600 | 0.8180 | 0.5480 | 0.5522 |
0.5335 | 78.26 | 18000 | 0.8268 | 0.5480 | 0.5522 |
0.5345 | 80.0 | 18400 | 0.8107 | 0.5476 | 0.5481 |
0.5312 | 81.74 | 18800 | 0.8222 | 0.5474 | 0.5492 |
0.531 | 83.48 | 19200 | 0.8179 | 0.5464 | 0.5486 |
0.5319 | 85.22 | 19600 | 0.8097 | 0.5469 | 0.5486 |
0.532 | 86.96 | 20000 | 0.8145 | 0.5457 | 0.5478 |
### Framework versions
- PEFT 0.9.0
- Transformers 4.38.2
- Pytorch 2.2.0+cu121
- Datasets 2.17.1
- Tokenizers 0.15.2
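To approximate this environment, the pinned versions above can be installed from PyPI; the package names below are the standard ones, and the `+cu121` PyTorch build would normally come from the official PyTorch CUDA wheel index rather than plain `pip install torch`.

```shell
# Pin the versions listed in "Framework versions" above.
pip install peft==0.9.0 transformers==4.38.2 datasets==2.17.1 tokenizers==0.15.2
# The card lists Pytorch 2.2.0+cu121; the CUDA-specific wheel is served
# from the PyTorch index, so a plain install gives the default build:
pip install torch==2.2.0
```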