# content

This model is a fine-tuned version of [zhihan1996/DNABERT-2-117M](https://huggingface.co/zhihan1996/DNABERT-2-117M) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.2180
- F1 Macro: 0.3388
## Model description
More information needed
## Intended uses & limitations
More information needed
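A minimal usage sketch, assuming the checkpoint is published as `hyoo14/content` and carries a sequence-classification head (DNABERT-2 derivatives require `trust_remote_code=True`); the DNA sequence and class index below are purely illustrative:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumption: the repo id and the classification head are illustrative,
# not confirmed by this card.
model_id = "hyoo14/content"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForSequenceClassification.from_pretrained(model_id, trust_remote_code=True)

# Score a toy DNA sequence and take the most likely class.
inputs = tokenizer("ACGTACGTACGTACGT", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(dim=-1).item()
print(predicted_class)
```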
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 1000
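A hedged sketch of how these hyperparameters map onto `transformers.TrainingArguments`; the `output_dir` and the evaluation/logging cadence (every 100 steps, inferred from the results table below) are assumptions, the remaining values come from the list above:

```python
from transformers import TrainingArguments

# Sketch only: output_dir, eval_strategy, eval_steps and logging_steps are assumptions.
training_args = TrainingArguments(
    output_dir="content",
    learning_rate=5e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=64,
    seed=42,
    max_steps=1000,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",
    eval_steps=100,
    logging_steps=100,
)
```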
### Training results
| Training Loss | Epoch  | Step | Validation Loss | F1 Macro |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 1.4189        | 0.2703 | 100  | 1.5083          | 0.0814   |
| 1.4284        | 0.5405 | 200  | 1.3513          | 0.1275   |
| 1.2829        | 0.8108 | 300  | 1.3179          | 0.1390   |
| 1.2192        | 1.0811 | 400  | 1.3522          | 0.2334   |
| 1.3097        | 1.3514 | 500  | 1.2843          | 0.2224   |
| 1.1668        | 1.6216 | 600  | 1.2668          | 0.2025   |
| 1.1595        | 1.8919 | 700  | 1.2268          | 0.2690   |
| 1.1336        | 2.1622 | 800  | 1.2596          | 0.2985   |
| 1.063         | 2.4324 | 900  | 1.2370          | 0.2709   |
| 1.0497        | 2.7027 | 1000 | 1.2180          | 0.3388   |
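The F1 Macro column is an unweighted mean of per-class F1 scores. A minimal `compute_metrics` sketch using scikit-learn (an assumption, not confirmed as the metric code actually used for this card):

```python
import numpy as np
from sklearn.metrics import f1_score

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair supplied by the Trainer.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Macro averaging weights every class equally, regardless of support.
    return {"f1_macro": f1_score(labels, predictions, average="macro")}
```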
### Framework versions
- Transformers 4.41.0
- PyTorch 1.13.1+cu117
- Datasets 2.19.1
- Tokenizers 0.19.1