# nucleotide-transformer-2.5b-multi-species_ft_BioS73_1kbpHG19_DHSs_H3K27AC
This model is a fine-tuned version of [InstaDeepAI/nucleotide-transformer-2.5b-multi-species](https://huggingface.co/InstaDeepAI/nucleotide-transformer-2.5b-multi-species) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.5995
- F1 Score: 0.8891
- Precision: 0.9113
- Recall: 0.8680
- Accuracy: 0.8845
- AUC: 0.9509
- PRC: 0.9520
## Model description
More information needed
## Intended uses & limitations
More information needed
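Pending a fuller model description, a minimal inference sketch is shown below. It assumes the checkpoint exposes a standard binary sequence-classification head (index 1 = positive class) and that the repo ID matches the title of this card; adjust both if your copy differs.

```python
# Hedged inference sketch for this fine-tuned checkpoint.
# Assumptions: the model loads via AutoModelForSequenceClassification,
# and logit index 1 corresponds to the positive (H3K27ac-marked) class.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumption: repo ID taken from the card title.
MODEL_ID = "nucleotide-transformer-2.5b-multi-species_ft_BioS73_1kbpHG19_DHSs_H3K27AC"


def predict_positive_prob(sequence: str) -> float:
    """Return the predicted probability of the positive class for a DNA sequence."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
    model.eval()
    inputs = tokenizer(sequence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return torch.softmax(logits, dim=-1)[0, 1].item()


if __name__ == "__main__":
    # A toy 1 kbp sequence, matching the 1kbpHG19 input length in the model name.
    print(predict_positive_prob("ACGT" * 250))
```

Note that the base model has 2.5B parameters, so inference realistically requires a GPU with substantial memory.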
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
## Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Score | Precision | Recall | Accuracy | AUC | PRC |
|---|---|---|---|---|---|---|---|---|---|
| 0.4838 | 0.1864 | 500 | 0.3717 | 0.8565 | 0.8426 | 0.8708 | 0.8442 | 0.9152 | 0.9120 |
| 0.3907 | 0.3727 | 1000 | 0.3634 | 0.8671 | 0.8058 | 0.9385 | 0.8464 | 0.9263 | 0.9224 |
| 0.3806 | 0.5591 | 1500 | 0.3516 | 0.8816 | 0.8351 | 0.9337 | 0.8662 | 0.9394 | 0.9356 |
| 0.3637 | 0.7454 | 2000 | 0.3030 | 0.8855 | 0.8503 | 0.9239 | 0.8725 | 0.9427 | 0.9399 |
| 0.3406 | 0.9318 | 2500 | 0.3303 | 0.8887 | 0.8406 | 0.9427 | 0.8740 | 0.9442 | 0.9410 |
| 0.2808 | 1.1182 | 3000 | 0.3769 | 0.8925 | 0.8919 | 0.8932 | 0.8852 | 0.9505 | 0.9477 |
| 0.2471 | 1.3045 | 3500 | 0.4407 | 0.8938 | 0.8614 | 0.9288 | 0.8822 | 0.9450 | 0.9401 |
| 0.2431 | 1.4909 | 4000 | 0.3490 | 0.8777 | 0.9107 | 0.8471 | 0.8740 | 0.9499 | 0.9478 |
| 0.2410 | 1.6772 | 4500 | 0.3971 | 0.8967 | 0.8578 | 0.9392 | 0.8845 | 0.9496 | 0.9465 |
| 0.2328 | 1.8636 | 5000 | 0.4200 | 0.8941 | 0.8644 | 0.9260 | 0.8830 | 0.9495 | 0.9488 |
| 0.1813 | 2.0499 | 5500 | 0.7308 | 0.8918 | 0.8500 | 0.9378 | 0.8785 | 0.9494 | 0.9485 |
| 0.0989 | 2.2363 | 6000 | 0.5995 | 0.8891 | 0.9113 | 0.8680 | 0.8845 | 0.9509 | 0.9520 |
### Framework versions
- Transformers 4.42.3
- PyTorch 2.3.0+cu121
- Datasets 2.18.0
- Tokenizers 0.19.0