---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dit-small_tobacco3482_kd_CEKD_t5.0_a0.9
  results: []
---

# dit-small_tobacco3482_kd_CEKD_t5.0_a0.9

This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco3482 dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4735
- Accuracy: 0.19
- Brier Loss: 0.8651
- NLL: 6.3618
- F1 Micro: 0.19
- F1 Macro: 0.0641
- ECE: 0.2456
- AURC: 0.7331
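
For anyone wanting to try the checkpoint, a minimal inference sketch follows. The checkpoint path is a placeholder, and the sketch assumes the saved model ships its image-processor configuration and an `id2label` mapping:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder path: replace with the actual Hub repo id or a local checkpoint directory.
checkpoint = "path/to/dit-small_tobacco3482_kd_CEKD_t5.0_a0.9"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)
model.eval()

# Classify a single document image (placeholder filename).
image = Image.open("document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```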

## Model description

More information needed
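
One detail can be read off the model name: the suffix `kd_CEKD_t5.0_a0.9` suggests this checkpoint was distilled from a teacher with a cross-entropy knowledge-distillation (CEKD) objective at temperature 5.0 and mixing weight 0.9. Below is a minimal sketch of the usual Hinton-style form of such a loss; the exact formulation and weighting convention used for this model are assumptions, not documented training code:

```python
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, temperature=5.0, alpha=0.9):
    """Hypothetical CEKD objective: hard-label cross-entropy blended with a
    temperature-softened KL term. Placing alpha on the CE term is an
    assumption read off the model name."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature**2  # classic T^2 rescaling keeps gradient magnitudes comparable
    return alpha * ce + (1.0 - alpha) * kd
```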

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
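
These settings map roughly onto `transformers.TrainingArguments` as sketched below. This is a reconstruction for convenience: the `output_dir` is a placeholder, and the distillation-specific machinery (teacher model, custom loss) is not shown.

```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameter list above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="dit-small_tobacco3482_kd_CEKD_t5.0_a0.9",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=16,  # 16 * 16 = total train batch size of 256
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=25,
)
```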

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL    | F1 Micro | F1 Macro | ECE    | AURC   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 0.96  | 3    | 2.6674          | 0.06     | 0.9042     | 9.2824 | 0.06     | 0.0114   | 0.1749 | 0.9042 |
| No log        | 1.96  | 6    | 2.5911          | 0.18     | 0.8886     | 6.4746 | 0.18     | 0.0305   | 0.2317 | 0.8026 |
| No log        | 2.96  | 9    | 2.5252          | 0.18     | 0.8764     | 7.5079 | 0.18     | 0.0305   | 0.2390 | 0.8141 |
| No log        | 3.96  | 12   | 2.5235          | 0.185    | 0.8777     | 6.9489 | 0.185    | 0.0488   | 0.2553 | 0.7838 |
| No log        | 4.96  | 15   | 2.5223          | 0.185    | 0.8754     | 6.8606 | 0.185    | 0.0488   | 0.2572 | 0.7773 |
| No log        | 5.96  | 18   | 2.5213          | 0.185    | 0.8732     | 5.9794 | 0.185    | 0.0488   | 0.2384 | 0.7684 |
| No log        | 6.96  | 21   | 2.5203          | 0.185    | 0.8723     | 5.9244 | 0.185    | 0.0488   | 0.2406 | 0.7603 |
| No log        | 7.96  | 24   | 2.5149          | 0.185    | 0.8713     | 5.9034 | 0.185    | 0.0488   | 0.2484 | 0.7560 |
| No log        | 8.96  | 27   | 2.5064          | 0.185    | 0.8701     | 5.9325 | 0.185    | 0.0488   | 0.2525 | 0.7529 |
| No log        | 9.96  | 30   | 2.5014          | 0.185    | 0.8695     | 6.7123 | 0.185    | 0.0488   | 0.2399 | 0.7528 |
| No log        | 10.96 | 33   | 2.4977          | 0.185    | 0.8693     | 6.7598 | 0.185    | 0.0488   | 0.2487 | 0.7511 |
| No log        | 11.96 | 36   | 2.4944          | 0.185    | 0.8692     | 6.8130 | 0.185    | 0.0488   | 0.2488 | 0.7476 |
| No log        | 12.96 | 39   | 2.4908          | 0.185    | 0.8688     | 6.7610 | 0.185    | 0.0488   | 0.2488 | 0.7452 |
| No log        | 13.96 | 42   | 2.4867          | 0.185    | 0.8680     | 6.6686 | 0.185    | 0.0488   | 0.2484 | 0.7428 |
| No log        | 14.96 | 45   | 2.4830          | 0.185    | 0.8673     | 6.6283 | 0.185    | 0.0488   | 0.2426 | 0.7431 |
| No log        | 15.96 | 48   | 2.4805          | 0.185    | 0.8668     | 6.4857 | 0.185    | 0.0488   | 0.2385 | 0.7410 |
| No log        | 16.96 | 51   | 2.4794          | 0.185    | 0.8666     | 6.4425 | 0.185    | 0.0488   | 0.2459 | 0.7385 |
| No log        | 17.96 | 54   | 2.4795          | 0.185    | 0.8664     | 6.0769 | 0.185    | 0.0488   | 0.2406 | 0.7352 |
| No log        | 18.96 | 57   | 2.4793          | 0.185    | 0.8664     | 6.1000 | 0.185    | 0.0488   | 0.2402 | 0.7355 |
| No log        | 19.96 | 60   | 2.4774          | 0.185    | 0.8660     | 6.3802 | 0.185    | 0.0488   | 0.2506 | 0.7346 |
| No log        | 20.96 | 63   | 2.4762          | 0.185    | 0.8657     | 6.4330 | 0.185    | 0.0488   | 0.2550 | 0.7344 |
| No log        | 21.96 | 66   | 2.4750          | 0.185    | 0.8654     | 6.3721 | 0.185    | 0.0488   | 0.2513 | 0.7333 |
| No log        | 22.96 | 69   | 2.4741          | 0.19     | 0.8652     | 6.3676 | 0.19     | 0.0641   | 0.2453 | 0.7332 |
| No log        | 23.96 | 72   | 2.4738          | 0.19     | 0.8652     | 6.3645 | 0.19     | 0.0641   | 0.2455 | 0.7331 |
| No log        | 24.96 | 75   | 2.4735          | 0.19     | 0.8651     | 6.3618 | 0.19     | 0.0641   | 0.2456 | 0.7331 |
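
The Brier Loss, NLL, and ECE columns are not produced by the stock `Trainer` metrics; the sketch below shows one common way to compute them from logits. The definitions assumed here (multi-class Brier score summed over classes, 10-bin equal-width ECE, mean NLL) may not match the exact conventions behind the table.

```python
import torch
import torch.nn.functional as F

def calibration_metrics(logits, labels, n_bins=10):
    """Generic Brier score, NLL, and ECE; a sketch, not the exact
    evaluation code behind the table above."""
    probs = F.softmax(logits, dim=-1)
    onehot = F.one_hot(labels, num_classes=probs.size(-1)).float()

    brier = (probs - onehot).pow(2).sum(dim=-1).mean()      # multi-class Brier score
    nll = F.nll_loss(probs.clamp_min(1e-12).log(), labels)  # mean negative log-likelihood

    conf, pred = probs.max(dim=-1)
    correct = pred.eq(labels).float()
    ece = torch.zeros(())
    edges = torch.linspace(0, 1, n_bins + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):               # equal-width confidence bins
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            gap = (conf[mask].mean() - correct[mask].mean()).abs()
            ece = ece + mask.float().mean() * gap
    return brier.item(), nll.item(), ece.item()
```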

### Framework versions

- Transformers 4.26.1
- PyTorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2