|
--- |
|
tags: |
|
- generated_from_trainer |
|
metrics: |
|
- accuracy |
|
model-index: |
|
- name: dit-small_tobacco3482_kd_CEKD_t2.5_a0.5 |
|
results: [] |
|
--- |
|
|
|
|
|
|
# dit-small_tobacco3482_kd_CEKD_t2.5_a0.5 |
|
|
|
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco3482 dataset.
|
It achieves the following results on the evaluation set: |
|
- Loss: 3.8936

- Accuracy: 0.185

- Brier Loss: 0.8707

- NLL: 6.6284

- F1 Micro: 0.185

- F1 Macro: 0.0488

- ECE: 0.2527

- AURC: 0.7434
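
Brier loss, NLL, ECE, and AURC measure probability quality and calibration rather than raw accuracy. As a reference, here is a minimal NumPy sketch of how Brier loss and ECE can be computed from softmax outputs (illustrative; not necessarily the exact evaluation code that produced the numbers above):

```python
import numpy as np

def brier_loss(probs, labels):
    """Mean squared distance between the predicted distribution and the
    one-hot label, averaged over examples (one common multi-class variant)."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def ece(probs, labels, n_bins=10):
    """Expected Calibration Error: bin predictions by confidence and take
    the size-weighted average of |bin accuracy - bin confidence|."""
    conf, pred = probs.max(axis=1), probs.argmax(axis=1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            total += mask.mean() * abs((pred[mask] == labels[mask]).mean() - conf[mask].mean())
    return total
```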
|
|
|
## Model description |
|
|
|
DiT (Document Image Transformer) is a vision transformer pre-trained in a self-supervised way on document images. Judging from the model name, this checkpoint is a DiT student fine-tuned for Tobacco3482 document classification with knowledge distillation: `CEKD` indicates a combined cross-entropy + KD objective, `t2.5` a distillation temperature of 2.5, and `a0.5` a mixing weight of 0.5.
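
A minimal sketch of that objective under the conventional Hinton-style formulation (the original training code is not included in this card, and whether `alpha` weights the CE or the KD term is an assumption):

```python
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, T=2.5, alpha=0.5):
    """alpha-weighted sum of hard-label cross-entropy and temperature-softened
    KL divergence between student and teacher distributions."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # T^2 keeps gradient magnitudes comparable to the CE term
    return alpha * ce + (1.0 - alpha) * kd
```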
|
|
|
## Intended uses & limitations |
|
|
|
The checkpoint targets 10-way document image classification on Tobacco3482-style data. Note the evaluation results above: 0.185 accuracy with an ECE of 0.2527 indicates an undertrained, poorly calibrated classifier, so this artifact is better suited to distillation experiments than to downstream use.
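
If the checkpoint is published on the Hub, inference follows the standard image-classification pipeline; a minimal sketch (the repo id and image path below are illustrative placeholders):

```python
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "<user>/dit-small_tobacco3482_kd_CEKD_t2.5_a0.5"  # illustrative repo id
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("scan.png").convert("RGB")  # illustrative path
inputs = processor(images=image, return_tensors="pt")
logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```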
|
|
|
## Training and evaluation data |
|
|
|
Fine-tuning and evaluation use the Tobacco3482 dataset (per the model name), a collection of 3,482 scanned document images spanning 10 classes. The exact train/validation split and preprocessing are not recorded in this card.
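
The dataset is not bundled here; assuming a local copy laid out one class per directory, it can be loaded with the `datasets` ImageFolder builder (paths and split below are illustrative):

```python
from datasets import load_dataset

# Assumes ./Tobacco3482/<class_name>/<image>.jpg; the path is illustrative.
ds = load_dataset("imagefolder", data_dir="./Tobacco3482")["train"]
ds = ds.train_test_split(test_size=0.2, seed=42)
print(ds["train"].features["label"].names)  # the 10 document classes
```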
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training: |
|
- learning_rate: 2e-05 |
|
- train_batch_size: 16 |
|
- eval_batch_size: 16 |
|
- seed: 42 |
|
- gradient_accumulation_steps: 16 |
|
- total_train_batch_size: 256 |
|
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 |
|
- lr_scheduler_type: linear |
|
- lr_scheduler_warmup_ratio: 0.1 |
|
- num_epochs: 25 |
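
A `TrainingArguments` sketch mirroring these settings (a reconstruction from the list above, not the original training script; 16 per-device examples × 16 accumulation steps gives the effective batch size of 256):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="dit-small_tobacco3482_kd_CEKD_t2.5_a0.5",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,  # 16 x 16 = 256 effective batch size
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=25,
)
```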
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL    | F1 Micro | F1 Macro | ECE    | AURC   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 0.96  | 3    | 4.2363          | 0.06     | 0.9043     | 9.2962 | 0.06     | 0.0114   | 0.1758 | 0.9032 |
| No log        | 1.96  | 6    | 4.1268          | 0.18     | 0.8887     | 6.8683 | 0.18     | 0.0305   | 0.2329 | 0.8055 |
| No log        | 2.96  | 9    | 4.0044          | 0.18     | 0.8773     | 7.3055 | 0.18     | 0.0305   | 0.2510 | 0.8219 |
| No log        | 3.96  | 12   | 3.9678          | 0.18     | 0.8851     | 7.2435 | 0.18     | 0.0305   | 0.2677 | 0.8214 |
| No log        | 4.96  | 15   | 3.9645          | 0.185    | 0.8877     | 6.9806 | 0.185    | 0.0488   | 0.2757 | 0.7934 |
| No log        | 5.96  | 18   | 3.9635          | 0.185    | 0.8853     | 6.9543 | 0.185    | 0.0488   | 0.2551 | 0.7812 |
| No log        | 6.96  | 21   | 3.9564          | 0.185    | 0.8801     | 6.0556 | 0.185    | 0.0488   | 0.2515 | 0.7771 |
| No log        | 7.96  | 24   | 3.9505          | 0.185    | 0.8772     | 6.0356 | 0.185    | 0.0488   | 0.2598 | 0.7724 |
| No log        | 8.96  | 27   | 3.9435          | 0.185    | 0.8751     | 6.0288 | 0.185    | 0.0488   | 0.2590 | 0.7697 |
| No log        | 9.96  | 30   | 3.9383          | 0.185    | 0.8742     | 6.0724 | 0.185    | 0.0488   | 0.2474 | 0.7712 |
| No log        | 10.96 | 33   | 3.9336          | 0.185    | 0.8746     | 6.7953 | 0.185    | 0.0488   | 0.2533 | 0.7685 |
| No log        | 11.96 | 36   | 3.9298          | 0.185    | 0.8755     | 6.9469 | 0.185    | 0.0488   | 0.2679 | 0.7659 |
| No log        | 12.96 | 39   | 3.9253          | 0.185    | 0.8756     | 6.9654 | 0.185    | 0.0488   | 0.2591 | 0.7640 |
| No log        | 13.96 | 42   | 3.9194          | 0.185    | 0.8750     | 6.9522 | 0.185    | 0.0488   | 0.2681 | 0.7604 |
| No log        | 14.96 | 45   | 3.9128          | 0.185    | 0.8744     | 6.9200 | 0.185    | 0.0488   | 0.2611 | 0.7617 |
| No log        | 15.96 | 48   | 3.9074          | 0.185    | 0.8733     | 6.8369 | 0.185    | 0.0488   | 0.2611 | 0.7600 |
| No log        | 16.96 | 51   | 3.9041          | 0.185    | 0.8726     | 6.8278 | 0.185    | 0.0488   | 0.2558 | 0.7566 |
| No log        | 17.96 | 54   | 3.9025          | 0.185    | 0.8719     | 6.7039 | 0.185    | 0.0488   | 0.2588 | 0.7510 |
| No log        | 18.96 | 57   | 3.9012          | 0.185    | 0.8717     | 6.6384 | 0.185    | 0.0488   | 0.2580 | 0.7484 |
| No log        | 19.96 | 60   | 3.8987          | 0.185    | 0.8712     | 6.6323 | 0.185    | 0.0488   | 0.2612 | 0.7450 |
| No log        | 20.96 | 63   | 3.8971          | 0.185    | 0.8712     | 6.6319 | 0.185    | 0.0488   | 0.2615 | 0.7443 |
| No log        | 21.96 | 66   | 3.8956          | 0.185    | 0.8710     | 6.6323 | 0.185    | 0.0488   | 0.2659 | 0.7439 |
| No log        | 22.96 | 69   | 3.8945          | 0.185    | 0.8708     | 6.6307 | 0.185    | 0.0488   | 0.2569 | 0.7436 |
| No log        | 23.96 | 72   | 3.8940          | 0.185    | 0.8708     | 6.6295 | 0.185    | 0.0488   | 0.2526 | 0.7434 |
| No log        | 24.96 | 75   | 3.8936          | 0.185    | 0.8707     | 6.6284 | 0.185    | 0.0488   | 0.2527 | 0.7434 |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.26.1 |
|
- Pytorch 1.13.1.post200 |
|
- Datasets 2.9.0 |
|
- Tokenizers 0.13.2 |
|
|