# gena-lm-bert-large-t2t_ft_BioS74_1kbpHG19_DHSs_H3K27AC
This model is a fine-tuned version of [AIRI-Institute/gena-lm-bert-large-t2t](https://huggingface.co/AIRI-Institute/gena-lm-bert-large-t2t) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.6104
- F1 Score: 0.8508
- Precision: 0.7905
- Recall: 0.9211
- Accuracy: 0.8309
- AUC: 0.9082
- PRC: 0.9023
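The thresholded metrics above (F1 score, precision, recall, accuracy) follow the standard binary-classification definitions. A minimal pure-Python sketch of those definitions, for reference only (this is not the evaluation script used to produce the numbers above):

```python
# Illustrative definitions of the thresholded metrics reported above.
# Labels are 0/1; AUC and PRC additionally require the raw scores and
# are omitted here.

def binary_metrics(y_true, y_pred):
    """Compute precision, recall, F1, and accuracy for 0/1 labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    accuracy = (tp + tn) / len(y_true)
    return {"precision": precision, "recall": recall, "f1": f1, "accuracy": accuracy}
```

The high recall (0.9211) relative to precision (0.7905) indicates the model favors catching positives at the cost of more false positives.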
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
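The hyperparameters above correspond roughly to the following `transformers.TrainingArguments`-style configuration (a sketch; the field-name mapping assumes the standard Hugging Face `Trainer` API and is not taken from the actual training script):

```python
# Hypothetical mapping of the listed hyperparameters onto TrainingArguments
# field names; shown as a plain dict so it stands alone.
training_config = {
    "learning_rate": 1e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,                # Adam betas=(0.9, 0.999)
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 20,
    "fp16": True,                     # "Native AMP" mixed precision
}
```

Note that although 20 epochs were configured, the results table below stops at epoch 3.4175, suggesting training ended early (e.g. via early stopping or a step limit); the card does not say which.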
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Score | Precision | Recall | Accuracy | AUC | PRC |
|---|---|---|---|---|---|---|---|---|---|
| 0.5815 | 0.1314 | 500 | 0.5055 | 0.7900 | 0.7764 | 0.8041 | 0.7762 | 0.8365 | 0.8345 |
| 0.5134 | 0.2629 | 1000 | 0.4808 | 0.7579 | 0.8332 | 0.6951 | 0.7676 | 0.8720 | 0.8626 |
| 0.4904 | 0.3943 | 1500 | 0.4344 | 0.8169 | 0.8103 | 0.8237 | 0.8067 | 0.8882 | 0.8847 |
| 0.4404 | 0.5258 | 2000 | 0.4801 | 0.8293 | 0.7525 | 0.9237 | 0.8009 | 0.8902 | 0.8859 |
| 0.429 | 0.6572 | 2500 | 0.4518 | 0.8328 | 0.7690 | 0.9081 | 0.8091 | 0.8927 | 0.8855 |
| 0.4476 | 0.7886 | 3000 | 0.4048 | 0.8323 | 0.8109 | 0.8548 | 0.8196 | 0.9020 | 0.9013 |
| 0.4312 | 0.9201 | 3500 | 0.4589 | 0.8375 | 0.7957 | 0.8840 | 0.8204 | 0.9013 | 0.8984 |
| 0.4323 | 1.0515 | 4000 | 0.4379 | 0.8264 | 0.8149 | 0.8383 | 0.8157 | 0.9014 | 0.8973 |
| 0.4131 | 1.1830 | 4500 | 0.4233 | 0.8331 | 0.7604 | 0.9211 | 0.8067 | 0.8992 | 0.8965 |
| 0.4114 | 1.3144 | 5000 | 0.4088 | 0.8360 | 0.7757 | 0.9066 | 0.8138 | 0.9018 | 0.8990 |
| 0.4108 | 1.4458 | 5500 | 0.4533 | 0.8298 | 0.8344 | 0.8252 | 0.8228 | 0.9045 | 0.9003 |
| 0.3896 | 1.5773 | 6000 | 0.4242 | 0.8395 | 0.8209 | 0.8589 | 0.8280 | 0.9066 | 0.9032 |
| 0.3956 | 1.7087 | 6500 | 0.3848 | 0.8336 | 0.8198 | 0.8478 | 0.8228 | 0.9094 | 0.9073 |
| 0.3783 | 1.8402 | 7000 | 0.3850 | 0.8330 | 0.8420 | 0.8242 | 0.8270 | 0.9099 | 0.9069 |
| 0.3917 | 1.9716 | 7500 | 0.4203 | 0.8448 | 0.8085 | 0.8845 | 0.8299 | 0.9096 | 0.9075 |
| 0.3692 | 2.1030 | 8000 | 0.4071 | 0.8466 | 0.8031 | 0.8950 | 0.8301 | 0.9090 | 0.9051 |
| 0.3651 | 2.2345 | 8500 | 0.4053 | 0.8447 | 0.8209 | 0.8699 | 0.8325 | 0.9131 | 0.9103 |
| 0.3731 | 2.3659 | 9000 | 0.4251 | 0.8084 | 0.8694 | 0.7554 | 0.8125 | 0.9120 | 0.9093 |
| 0.3572 | 2.4974 | 9500 | 0.5074 | 0.8465 | 0.7915 | 0.9096 | 0.8272 | 0.9113 | 0.9075 |
| 0.3444 | 2.6288 | 10000 | 0.4182 | 0.8435 | 0.8344 | 0.8528 | 0.8343 | 0.9127 | 0.9091 |
| 0.3626 | 2.7603 | 10500 | 0.4595 | 0.8518 | 0.7839 | 0.9327 | 0.8301 | 0.9131 | 0.9101 |
| 0.3443 | 2.8917 | 11000 | 0.5263 | 0.8354 | 0.8438 | 0.8272 | 0.8293 | 0.9140 | 0.9134 |
| 0.3475 | 3.0231 | 11500 | 0.5710 | 0.8482 | 0.8048 | 0.8965 | 0.8320 | 0.9086 | 0.9048 |
| 0.3155 | 3.1546 | 12000 | 0.6372 | 0.8494 | 0.8013 | 0.9036 | 0.8322 | 0.9113 | 0.9085 |
| 0.32 | 3.2860 | 12500 | 0.6357 | 0.8389 | 0.8218 | 0.8569 | 0.8278 | 0.9111 | 0.9093 |
| 0.3407 | 3.4175 | 13000 | 0.6104 | 0.8508 | 0.7905 | 0.9211 | 0.8309 | 0.9082 | 0.9023 |
### Framework versions
- Transformers 4.42.3
- PyTorch 2.3.0+cu121
- Datasets 2.18.0
- Tokenizers 0.19.0
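For reference, inference with this checkpoint would look roughly like the sketch below. The model and repository names come from this card; the `softmax` helper and the `classify` function are illustrative, and `trust_remote_code=True` is assumed to be required because GENA-LM models ship custom model code on the Hub.

```python
# Hedged inference sketch for this checkpoint (not the author's code).
import math

def softmax(logits):
    """Convert a list of raw logits to probabilities (pure Python)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def classify(sequence: str):
    """Score one DNA sequence with the fine-tuned model.

    Downloads the checkpoint from the Hub on first call, so this
    function is defined but not executed here.
    """
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    name = "tanoManzo/gena-lm-bert-large-t2t_ft_BioS74_1kbpHG19_DHSs_H3K27AC"
    tokenizer = AutoTokenizer.from_pretrained(name, trust_remote_code=True)
    model = AutoModelForSequenceClassification.from_pretrained(
        name, trust_remote_code=True
    )
    model.eval()
    with torch.no_grad():
        logits = model(**tokenizer(sequence, return_tensors="pt")).logits[0]
    return softmax(logits.tolist())  # per-class probabilities
```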