# bdc2024-tpg-2
This model is a fine-tuned version of Mikask/bdc2024-tpg on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0953
- Accuracy: 0.9825
- Balanced Accuracy: 0.9863
- Precision: 0.9832
- Recall: 0.9825
- F1: 0.9826
## Model description
More information needed
## Intended uses & limitations
More information needed
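While usage details are pending, the sketch below shows one plausible way to run inference. It assumes the checkpoint carries a sequence-classification head loadable through the Transformers auto classes; the input text and label lookup are placeholders, not documented behavior.

```python
# Hedged inference sketch; assumes a sequence-classification checkpoint.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "Mikask/bdc2024-tpg-2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "example input text"  # placeholder input
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred_id = logits.argmax(dim=-1).item()
# id2label defaults to generic names (e.g. LABEL_0) unless set at training time.
print(model.config.id2label.get(pred_id, pred_id))
```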
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the matching `TrainingArguments` follows the list):
- learning_rate: 6e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
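As a point of reference, these values map onto the Hugging Face `Trainer` configuration roughly as sketched below; the output directory and per-epoch evaluation strategy are assumptions, and the Adam betas/epsilon shown are the library defaults made explicit.

```python
# Sketch of TrainingArguments mirroring the listed hyperparameters
# (Transformers 4.33 API); output_dir and evaluation_strategy are assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bdc2024-tpg-2",   # assumption: not stated in the card
    learning_rate=6e-06,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=3,
    evaluation_strategy="epoch",  # assumption: matches the per-epoch results table
)
```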
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Balanced Accuracy | Precision | Recall | F1     |
|---------------|-------|------|-----------------|----------|-------------------|-----------|--------|--------|
| 0.0781        | 1.0   | 900  | 0.0990          | 0.9803   | 0.9779            | 0.9810    | 0.9803 | 0.9804 |
| 0.0228        | 2.0   | 1800 | 0.0925          | 0.9782   | 0.9752            | 0.9788    | 0.9782 | 0.9782 |
| 0.0170        | 3.0   | 2700 | 0.0953          | 0.9825   | 0.9863            | 0.9832    | 0.9825 | 0.9826 |
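The metric columns above could be produced by a `compute_metrics` callback along the lines of the sketch below; scikit-learn and weighted averaging for precision/recall/F1 are assumptions, as the card does not state how these were computed.

```python
# Hedged sketch of a compute_metrics function producing the columns above.
# Weighted averaging is an assumption; the card does not specify it.
import numpy as np
from sklearn.metrics import (
    accuracy_score,
    balanced_accuracy_score,
    precision_recall_fscore_support,
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "balanced_accuracy": balanced_accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```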
## Framework versions
- Transformers 4.33.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.2
- Tokenizers 0.13.3