# dna_bert_3_2-finetuned
This model is a fine-tuned version of [armheb/DNA_bert_3](https://huggingface.co/armheb/DNA_bert_3) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.4668
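The base checkpoint is a BERT-style masked language model trained on overlapping 3-mer DNA tokens, so the fine-tuned weights should load through the standard `transformers` Auto classes. The sketch below is not taken from this card: the Hub repository id is a hypothetical placeholder, and the masked-LM head is an assumption, since the card documents neither the task head nor the final repository path.

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Hypothetical repository id -- replace with the actual Hub path of this checkpoint.
MODEL_ID = "<user>/dna_bert_3_2-finetuned"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)  # assumed head; the task is undocumented
model.eval()

def to_kmers(sequence: str, k: int = 3) -> str:
    """DNA_bert_3 expects space-separated overlapping 3-mers rather than raw bases."""
    return " ".join(sequence[i : i + k] for i in range(len(sequence) - k + 1))

inputs = tokenizer(to_kmers("ATGCGTACGTTAGC"), return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, tokens, vocab)
```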
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
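As a reading aid, the list above maps roughly onto the following `Trainer` configuration. This is a reconstruction under stated assumptions, not the original training script: the output directory is arbitrary, the train/eval datasets are placeholders because the data are undocumented, and the MLM collator applies only if the task really was masked-language modeling. The Adam betas and epsilon listed above are the `transformers` defaults, so they need no explicit arguments.

```python
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("armheb/DNA_bert_3")
model = AutoModelForMaskedLM.from_pretrained("armheb/DNA_bert_3")  # base checkpoint; head assumed

training_args = TrainingArguments(
    output_dir="dna_bert_3_2-finetuned",  # arbitrary
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                    # "Native AMP" mixed-precision training (requires a GPU)
    evaluation_strategy="epoch",  # matches the per-epoch validation losses below
    logging_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,  # placeholder: the undocumented training split
    eval_dataset=eval_dataset,    # placeholder: the undocumented evaluation split
    data_collator=DataCollatorForLanguageModeling(tokenizer),  # assumption: MLM objective
)
trainer.train()
```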
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
0.8974 | 1.0 | 62 | 0.6160 |
0.6057 | 2.0 | 124 | 0.6000 |
0.5957 | 3.0 | 186 | 0.5897 |
0.5883 | 4.0 | 248 | 0.5873 |
0.5844 | 5.0 | 310 | 0.5843 |
0.5812 | 6.0 | 372 | 0.5811 |
0.5812 | 7.0 | 434 | 0.5832 |
0.5769 | 8.0 | 496 | 0.5773 |
0.5727 | 9.0 | 558 | 0.5771 |
0.5702 | 10.0 | 620 | 0.5772 |
0.5673 | 11.0 | 682 | 0.5771 |
0.5663 | 12.0 | 744 | 0.5769 |
0.5569 | 13.0 | 806 | 0.5731 |
0.5518 | 14.0 | 868 | 0.5731 |
0.5486 | 15.0 | 930 | 0.5728 |
0.544 | 16.0 | 992 | 0.5683 |
0.5336 | 17.0 | 1054 | 0.5694 |
0.5245 | 18.0 | 1116 | 0.5639 |
0.5162 | 19.0 | 1178 | 0.5641 |
0.5057 | 20.0 | 1240 | 0.5626 |
0.4966 | 21.0 | 1302 | 0.5612 |
0.4859 | 22.0 | 1364 | 0.5492 |
0.4781 | 23.0 | 1426 | 0.5470 |
0.4601 | 24.0 | 1488 | 0.5399 |
0.4523 | 25.0 | 1550 | 0.5424 |
0.4432 | 26.0 | 1612 | 0.5328 |
0.4341 | 27.0 | 1674 | 0.5336 |
0.4183 | 28.0 | 1736 | 0.5315 |
0.4133 | 29.0 | 1798 | 0.5268 |
0.4111 | 30.0 | 1860 | 0.5256 |
0.3919 | 31.0 | 1922 | 0.5155 |
0.3899 | 32.0 | 1984 | 0.5179 |
0.3804 | 33.0 | 2046 | 0.5145 |
0.368 | 34.0 | 2108 | 0.5189 |
0.3603 | 35.0 | 2170 | 0.5081 |
0.3602 | 36.0 | 2232 | 0.5098 |
0.352 | 37.0 | 2294 | 0.5054 |
0.3468 | 38.0 | 2356 | 0.5024 |
0.3359 | 39.0 | 2418 | 0.5053 |
0.3342 | 40.0 | 2480 | 0.5031 |
0.3294 | 41.0 | 2542 | 0.4978 |
0.3158 | 42.0 | 2604 | 0.4923 |
0.3191 | 43.0 | 2666 | 0.4944 |
0.3122 | 44.0 | 2728 | 0.4970 |
0.3084 | 45.0 | 2790 | 0.4910 |
0.2978 | 46.0 | 2852 | 0.4898 |
0.3012 | 47.0 | 2914 | 0.4880 |
0.2938 | 48.0 | 2976 | 0.4924 |
0.2932 | 49.0 | 3038 | 0.4879 |
0.2842 | 50.0 | 3100 | 0.4847 |
0.2828 | 51.0 | 3162 | 0.4849 |
0.2793 | 52.0 | 3224 | 0.4767 |
0.2753 | 53.0 | 3286 | 0.4796 |
0.2725 | 54.0 | 3348 | 0.4829 |
0.2695 | 55.0 | 3410 | 0.4831 |
0.2671 | 56.0 | 3472 | 0.4791 |
0.2664 | 57.0 | 3534 | 0.4791 |
0.2563 | 58.0 | 3596 | 0.4765 |
0.2583 | 59.0 | 3658 | 0.4742 |
0.2535 | 60.0 | 3720 | 0.4766 |
0.2496 | 61.0 | 3782 | 0.4741 |
0.2489 | 62.0 | 3844 | 0.4766 |
0.2444 | 63.0 | 3906 | 0.4748 |
0.2417 | 64.0 | 3968 | 0.4768 |
0.2422 | 65.0 | 4030 | 0.4727 |
0.2404 | 66.0 | 4092 | 0.4729 |
0.2405 | 67.0 | 4154 | 0.4744 |
0.2353 | 68.0 | 4216 | 0.4729 |
0.2307 | 69.0 | 4278 | 0.4705 |
0.2281 | 70.0 | 4340 | 0.4717 |
0.232 | 71.0 | 4402 | 0.4719 |
0.2313 | 72.0 | 4464 | 0.4713 |
0.2266 | 73.0 | 4526 | 0.4726 |
0.2241 | 74.0 | 4588 | 0.4675 |
0.2256 | 75.0 | 4650 | 0.4688 |
0.2299 | 76.0 | 4712 | 0.4713 |
0.2199 | 77.0 | 4774 | 0.4720 |
0.2228 | 78.0 | 4836 | 0.4682 |
0.2261 | 79.0 | 4898 | 0.4676 |
0.2167 | 80.0 | 4960 | 0.4685 |
0.2126 | 81.0 | 5022 | 0.4676 |
0.2217 | 82.0 | 5084 | 0.4672 |
0.216 | 83.0 | 5146 | 0.4672 |
0.2152 | 84.0 | 5208 | 0.4682 |
0.219 | 85.0 | 5270 | 0.4663 |
0.2135 | 86.0 | 5332 | 0.4655 |
0.2046 | 87.0 | 5394 | 0.4644 |
0.2177 | 88.0 | 5456 | 0.4679 |
0.2052 | 89.0 | 5518 | 0.4659 |
0.2147 | 90.0 | 5580 | 0.4665 |
0.211 | 91.0 | 5642 | 0.4668 |
0.2089 | 92.0 | 5704 | 0.4649 |
0.2149 | 93.0 | 5766 | 0.4651 |
0.2034 | 94.0 | 5828 | 0.4689 |
0.2071 | 95.0 | 5890 | 0.4659 |
0.2145 | 96.0 | 5952 | 0.4664 |
0.2036 | 97.0 | 6014 | 0.4661 |
0.2092 | 98.0 | 6076 | 0.4676 |
0.2079 | 99.0 | 6138 | 0.4667 |
0.2081 | 100.0 | 6200 | 0.4668 |
### Framework versions
- Transformers 4.21.1
- PyTorch 1.12.0+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1