# GUE_EMP_H3K9ac-seqsight_4096_512_15M-L1
This model is a fine-tuned version of mahdibaghbanzadeh/seqsight_4096_512_15M on the mahdibaghbanzadeh/GUE_EMP_H3K9ac dataset. It achieves the following results on the evaluation set:
- Loss: 0.6162
- F1 Score: 0.6557
- Accuracy: 0.6553
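The card reports both F1 and accuracy but does not state how the F1 score is averaged. As a point of reference, a minimal pure-Python sketch of binary F1 (positive class = 1) and accuracy, evaluated on illustrative toy labels rather than the GUE_EMP_H3K9ac data:

```python
def binary_f1_accuracy(y_true, y_pred):
    """Compute accuracy and binary F1 (positive class = 1) from label lists."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    accuracy = correct / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return f1, accuracy

# Toy labels (illustrative only, not from the dataset):
f1, acc = binary_f1_accuracy([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```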
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 20000
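The hyperparameters above can be expressed as a `transformers` `TrainingArguments` config. This is a sketch, not the exact training script: `output_dir` and the evaluation cadence are assumptions (the results table suggests evaluation every 400 steps), and the Adam settings shown are the `transformers` defaults matching the values listed.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters; output_dir and eval cadence are assumed.
args = TrainingArguments(
    output_dir="GUE_EMP_H3K9ac-seqsight_4096_512_15M-L1",  # hypothetical path
    learning_rate=5e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    adam_beta1=0.9,       # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    max_steps=20_000,
    evaluation_strategy="steps",  # assumed: the table logs eval every 400 steps
    eval_steps=400,
    logging_steps=400,
)
```

This is a config fragment; plugging it into a `Trainer` additionally requires the model, tokenized datasets, and a `compute_metrics` callback.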
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Score | Accuracy |
|---|---|---|---|---|---|
0.6888 | 2.3 | 400 | 0.6883 | 0.5303 | 0.5642 |
0.6442 | 4.6 | 800 | 0.6428 | 0.6118 | 0.6121 |
0.6266 | 6.9 | 1200 | 0.6346 | 0.6255 | 0.6247 |
0.6211 | 9.2 | 1600 | 0.6402 | 0.6197 | 0.6229 |
0.6168 | 11.49 | 2000 | 0.6335 | 0.6362 | 0.6394 |
0.61 | 13.79 | 2400 | 0.6288 | 0.6335 | 0.6330 |
0.6044 | 16.09 | 2800 | 0.6297 | 0.6398 | 0.6394 |
0.6026 | 18.39 | 3200 | 0.6310 | 0.6254 | 0.6272 |
0.6001 | 20.69 | 3600 | 0.6370 | 0.6361 | 0.6376 |
0.596 | 22.99 | 4000 | 0.6294 | 0.6328 | 0.6322 |
0.5919 | 25.29 | 4400 | 0.6361 | 0.6361 | 0.6355 |
0.5919 | 27.59 | 4800 | 0.6434 | 0.6336 | 0.6330 |
0.5903 | 29.89 | 5200 | 0.6384 | 0.6377 | 0.6369 |
0.5888 | 32.18 | 5600 | 0.6370 | 0.6344 | 0.6348 |
0.5899 | 34.48 | 6000 | 0.6334 | 0.6292 | 0.6301 |
0.583 | 36.78 | 6400 | 0.6361 | 0.6299 | 0.6312 |
0.5838 | 39.08 | 6800 | 0.6456 | 0.6316 | 0.6326 |
0.5832 | 41.38 | 7200 | 0.6362 | 0.6337 | 0.6330 |
0.5827 | 43.68 | 7600 | 0.6370 | 0.6300 | 0.6297 |
0.5814 | 45.98 | 8000 | 0.6459 | 0.6304 | 0.6297 |
0.5773 | 48.28 | 8400 | 0.6421 | 0.6279 | 0.6272 |
0.5785 | 50.57 | 8800 | 0.6469 | 0.6319 | 0.6312 |
0.5753 | 52.87 | 9200 | 0.6545 | 0.6290 | 0.6283 |
0.575 | 55.17 | 9600 | 0.6464 | 0.6276 | 0.6268 |
0.5749 | 57.47 | 10000 | 0.6404 | 0.6233 | 0.6225 |
0.5753 | 59.77 | 10400 | 0.6538 | 0.6218 | 0.6243 |
0.5743 | 62.07 | 10800 | 0.6419 | 0.6272 | 0.6265 |
0.5726 | 64.37 | 11200 | 0.6527 | 0.6240 | 0.6236 |
0.5736 | 66.67 | 11600 | 0.6491 | 0.6240 | 0.6232 |
0.5703 | 68.97 | 12000 | 0.6464 | 0.6291 | 0.6283 |
0.571 | 71.26 | 12400 | 0.6503 | 0.6294 | 0.6297 |
0.5712 | 73.56 | 12800 | 0.6503 | 0.6272 | 0.6265 |
0.5696 | 75.86 | 13200 | 0.6531 | 0.6319 | 0.6312 |
0.5723 | 78.16 | 13600 | 0.6531 | 0.6289 | 0.6283 |
0.5701 | 80.46 | 14000 | 0.6445 | 0.6250 | 0.6250 |
0.5694 | 82.76 | 14400 | 0.6443 | 0.6319 | 0.6312 |
0.57 | 85.06 | 14800 | 0.6482 | 0.6283 | 0.6276 |
0.5673 | 87.36 | 15200 | 0.6438 | 0.6272 | 0.6265 |
0.5675 | 89.66 | 15600 | 0.6509 | 0.6290 | 0.6286 |
0.5679 | 91.95 | 16000 | 0.6587 | 0.6272 | 0.6265 |
0.5658 | 94.25 | 16400 | 0.6538 | 0.6245 | 0.6240 |
0.5679 | 96.55 | 16800 | 0.6530 | 0.6256 | 0.6250 |
0.5669 | 98.85 | 17200 | 0.6499 | 0.6261 | 0.6254 |
0.5653 | 101.15 | 17600 | 0.6569 | 0.6253 | 0.6247 |
0.5666 | 103.45 | 18000 | 0.6499 | 0.6234 | 0.6232 |
0.565 | 105.75 | 18400 | 0.6532 | 0.6232 | 0.6229 |
0.5645 | 108.05 | 18800 | 0.6545 | 0.6258 | 0.6250 |
0.5664 | 110.34 | 19200 | 0.6515 | 0.6250 | 0.6243 |
0.5655 | 112.64 | 19600 | 0.6526 | 0.6255 | 0.6247 |
0.5637 | 114.94 | 20000 | 0.6524 | 0.6265 | 0.6258 |
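The `linear` scheduler listed under the hyperparameters decays the 0.0005 learning rate to zero over the 20,000 training steps. A minimal sketch of that decay, mirroring `transformers`' `get_linear_schedule_with_warmup` (a warmup of 0 steps is an assumption; the card does not list one):

```python
def linear_lr(step, base_lr=5e-4, total_steps=20_000, warmup_steps=0):
    """Learning rate at a given step under linear warmup + linear decay."""
    if step < warmup_steps:
        # Linear ramp from 0 to base_lr during warmup.
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr down to 0 at total_steps.
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Learning rate at the start, midpoint, and end of training:
linear_lr(0), linear_lr(10_000), linear_lr(20_000)  # 5e-4, 2.5e-4, 0.0
```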
### Framework versions
- PEFT 0.9.0
- Transformers 4.38.2
- Pytorch 2.2.0+cu121
- Datasets 2.17.1
- Tokenizers 0.15.2