---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dit-tiny_tobacco3482_kd_CEKD_t2.5_a0.9
  results: []
---

# dit-tiny_tobacco3482_kd_CEKD_t2.5_a0.9

This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco3482 dataset.
It achieves the following results on the evaluation set:

- Loss: 2.5379
- Accuracy: 0.18
- Brier Loss: 0.8746
- NLL: 6.7389
- F1 Micro: 0.18
- F1 Macro: 0.0306
- ECE: 0.2460
- AURC: 0.8496
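
For reference, the less common metrics above (Brier loss, ECE) can be computed from the model's predicted class probabilities roughly as in the sketch below. This is a minimal illustration only: the 15-bin equal-width binning for ECE is an assumption, not something recorded in this card.

```python
import numpy as np

def brier_loss(probs, labels):
    """Multiclass Brier score: mean squared distance to the one-hot label."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def ece(probs, labels, n_bins=15):
    """Expected calibration error over equal-width confidence bins (assumed scheme)."""
    conf = probs.max(axis=1)      # confidence = probability of the predicted class
    pred = probs.argmax(axis=1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            acc = (pred[mask] == labels[mask]).mean()
            # weight each bin's |accuracy - confidence| gap by its sample share
            total += mask.mean() * abs(acc - conf[mask].mean())
    return total
```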

## Model description

More information needed. Judging by the model name, this appears to be a DiT (Document Image Transformer) checkpoint trained with a combined cross-entropy/knowledge-distillation (CEKD) objective, with distillation temperature t = 2.5 and distillation weight a = 0.9, on the Tobacco3482 document classification dataset.
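
A minimal sketch of the kind of CE + KD objective the `CEKD_t2.5_a0.9` suffix suggests follows. The function below, and the convention that `alpha` weights the distillation term, are assumptions rather than anything this card confirms.

```python
import torch
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, temperature=2.5, alpha=0.9):
    """Hypothetical CE + KD objective matching the t2.5/a0.9 name suffix.

    alpha weighs the distillation term; (1 - alpha) weighs plain cross-entropy.
    The T**2 factor keeps gradient magnitudes comparable across temperatures.
    """
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * kd + (1 - alpha) * ce
```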

## Intended uses & limitations

More information needed. Presumably the model is intended for document image classification on Tobacco3482-style inputs. Note that the reported evaluation accuracy (0.18) is barely above chance for a 10-class problem, and the macro F1 (0.0306) suggests predictions collapse onto very few classes, so this checkpoint is best treated as a distillation-experiment artifact rather than a usable classifier.
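
If you do want to run it, a standard Transformers image-classification loop should work. The model id below is a guess based on the model name (use your own local path or hub id), and the Auto classes are the usual defaults rather than anything confirmed by this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "dit-tiny_tobacco3482_kd_CEKD_t2.5_a0.9"  # hypothetical hub id / local path
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("document.png").convert("RGB")  # any scanned document page
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```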

## Training and evaluation data

More information needed. The model name points to Tobacco3482, a benchmark of 3,482 scanned document images in 10 classes, but the exact train/validation split used here is not recorded in this card.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
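
These settings map onto Transformers `TrainingArguments` roughly as in the sketch below, assuming the standard `Trainer` API of the 4.26 release; `output_dir` is a placeholder, and the Adam settings are simply the Trainer defaults stated explicitly.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dit-tiny_tobacco3482_kd_CEKD_t2.5_a0.9",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=16,  # 16 * 16 = 256 effective train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=25,
    adam_beta1=0.9,    # Trainer defaults, listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```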

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 2.6891 | 0.145 | 0.8999 | 10.1550 | 0.145 | 0.0253 | 0.2220 | 0.8466 |
| No log | 1.96 | 6 | 2.6592 | 0.145 | 0.8947 | 10.5706 | 0.145 | 0.0253 | 0.2238 | 0.8463 |
| No log | 2.96 | 9 | 2.6158 | 0.14 | 0.8869 | 8.5528 | 0.14 | 0.0422 | 0.2066 | 0.8175 |
| No log | 3.96 | 12 | 2.5827 | 0.175 | 0.8810 | 6.5464 | 0.175 | 0.0467 | 0.2385 | 0.8661 |
| No log | 4.96 | 15 | 2.5647 | 0.155 | 0.8781 | 6.8570 | 0.155 | 0.0274 | 0.2316 | 0.8886 |
| No log | 5.96 | 18 | 2.5566 | 0.19 | 0.8772 | 8.4283 | 0.19 | 0.0413 | 0.2460 | 0.8532 |
| No log | 6.96 | 21 | 2.5515 | 0.18 | 0.8769 | 7.6865 | 0.18 | 0.0308 | 0.2480 | 0.8517 |
| No log | 7.96 | 24 | 2.5475 | 0.18 | 0.8767 | 6.9727 | 0.18 | 0.0306 | 0.2469 | 0.8521 |
| No log | 8.96 | 27 | 2.5438 | 0.18 | 0.8762 | 6.9080 | 0.18 | 0.0306 | 0.2438 | 0.8525 |
| No log | 9.96 | 30 | 2.5420 | 0.18 | 0.8758 | 6.8906 | 0.18 | 0.0306 | 0.2521 | 0.8528 |
| No log | 10.96 | 33 | 2.5410 | 0.18 | 0.8755 | 6.8317 | 0.18 | 0.0306 | 0.2516 | 0.8524 |
| No log | 11.96 | 36 | 2.5404 | 0.18 | 0.8753 | 6.7606 | 0.18 | 0.0306 | 0.2469 | 0.8516 |
| No log | 12.96 | 39 | 2.5401 | 0.18 | 0.8752 | 6.7444 | 0.18 | 0.0306 | 0.2425 | 0.8516 |
| No log | 13.96 | 42 | 2.5397 | 0.18 | 0.8751 | 6.7397 | 0.18 | 0.0306 | 0.2498 | 0.8514 |
| No log | 14.96 | 45 | 2.5393 | 0.18 | 0.8750 | 6.7390 | 0.18 | 0.0306 | 0.2579 | 0.8511 |
| No log | 15.96 | 48 | 2.5389 | 0.18 | 0.8749 | 6.7366 | 0.18 | 0.0306 | 0.2463 | 0.8513 |
| No log | 16.96 | 51 | 2.5387 | 0.18 | 0.8749 | 6.7390 | 0.18 | 0.0306 | 0.2465 | 0.8510 |
| No log | 17.96 | 54 | 2.5389 | 0.18 | 0.8749 | 6.7382 | 0.18 | 0.0306 | 0.2425 | 0.8505 |
| No log | 18.96 | 57 | 2.5389 | 0.18 | 0.8749 | 6.7397 | 0.18 | 0.0306 | 0.2463 | 0.8504 |
| No log | 19.96 | 60 | 2.5384 | 0.18 | 0.8748 | 6.7391 | 0.18 | 0.0306 | 0.2421 | 0.8495 |
| No log | 20.96 | 63 | 2.5383 | 0.18 | 0.8747 | 6.7396 | 0.18 | 0.0306 | 0.2422 | 0.8500 |
| No log | 21.96 | 66 | 2.5380 | 0.18 | 0.8747 | 6.7399 | 0.18 | 0.0306 | 0.2460 | 0.8496 |
| No log | 22.96 | 69 | 2.5379 | 0.18 | 0.8746 | 6.7395 | 0.18 | 0.0306 | 0.2460 | 0.8497 |
| No log | 23.96 | 72 | 2.5379 | 0.18 | 0.8746 | 6.7393 | 0.18 | 0.0306 | 0.2460 | 0.8497 |
| No log | 24.96 | 75 | 2.5379 | 0.18 | 0.8746 | 6.7389 | 0.18 | 0.0306 | 0.2460 | 0.8496 |

### Framework versions

- Transformers 4.26.1
- PyTorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2
|