---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dit-small_tobacco3482_kd_CEKD_t5.0_a0.5
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# dit-small_tobacco3482_kd_CEKD_t5.0_a0.5

This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on an unspecified dataset (the model name suggests Tobacco3482).
It achieves the following results on the evaluation set:
- Loss: 3.7912
- Accuracy: 0.185
- Brier Loss: 0.8688
- NLL: 5.6106
- F1 Micro: 0.185
- F1 Macro: 0.0488
- ECE: 0.2524
- AURC: 0.7391
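
For context, the calibration metrics above follow their standard multi-class definitions; a minimal NumPy sketch of Brier loss and ECE (function and variable names are illustrative, not taken from the evaluation code):

```python
import numpy as np

def brier_loss(probs, labels):
    """Mean squared distance between predicted probabilities and one-hot labels."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=10):
    """Gap between confidence and accuracy, averaged over equal-width confidence bins."""
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs((pred[in_bin] == labels[in_bin]).mean() - conf[in_bin].mean())
    return ece
```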

## Model description

More information needed

## Intended uses & limitations

More information needed
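
No usage guidance was provided, but as a DiT-based image-classification checkpoint it should load with the standard `transformers` image-classification API. A minimal sketch, assuming the checkpoint is published under this repo's ID (replace `ckpt` with the actual path; the label set depends on the fine-tuning data):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

ckpt = "dit-small_tobacco3482_kd_CEKD_t5.0_a0.5"  # illustrative; use the actual repo ID
processor = AutoImageProcessor.from_pretrained(ckpt)
model = AutoModelForImageClassification.from_pretrained(ckpt)

image = Image.open("document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```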

## Training and evaluation data

More information needed

## Training procedure
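
The model name encodes the distillation setup: `CEKD` (cross-entropy combined with knowledge distillation), temperature `t5.0`, and mixing weight `a0.5`. The training code is not included here, but a common formulation of such a combined loss looks like the following PyTorch sketch (how `alpha` weights the two terms varies between implementations):

```python
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, temperature=5.0, alpha=0.5):
    """Blend hard-label cross-entropy with a temperature-scaled distillation term."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature**2  # rescale gradients as in Hinton et al. (2015)
    return alpha * ce + (1 - alpha) * kd
```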

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
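
These settings map directly onto `transformers.TrainingArguments`; a sketch of an equivalent configuration (the output directory is illustrative, and the Adam betas/epsilon listed above are the library defaults, so they need no explicit arguments):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dit-small_tobacco3482_kd_CEKD_t5.0_a0.5",  # illustrative
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=16,  # effective train batch size: 16 * 16 = 256
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=25,
)
```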

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL    | F1 Micro | F1 Macro | ECE    | AURC   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 0.96  | 3    | 4.0715          | 0.06     | 0.9043     | 8.8976 | 0.06     | 0.0114   | 0.1751 | 0.9034 |
| No log        | 1.96  | 6    | 3.9774          | 0.18     | 0.8893     | 8.0316 | 0.18     | 0.0305   | 0.2237 | 0.8040 |
| No log        | 2.96  | 9    | 3.8805          | 0.18     | 0.8782     | 8.6752 | 0.18     | 0.0305   | 0.2566 | 0.8189 |
| No log        | 3.96  | 12   | 3.8615          | 0.18     | 0.8836     | 8.9177 | 0.18     | 0.0305   | 0.2645 | 0.8205 |
| No log        | 4.96  | 15   | 3.8624          | 0.185    | 0.8844     | 6.3245 | 0.185    | 0.0488   | 0.2727 | 0.7889 |
| No log        | 5.96  | 18   | 3.8605          | 0.185    | 0.8813     | 5.1679 | 0.185    | 0.0488   | 0.2558 | 0.7797 |
| No log        | 6.96  | 21   | 3.8511          | 0.185    | 0.8774     | 5.1770 | 0.185    | 0.0488   | 0.2510 | 0.7741 |
| No log        | 7.96  | 24   | 3.8410          | 0.185    | 0.8751     | 5.6014 | 0.185    | 0.0488   | 0.2458 | 0.7699 |
| No log        | 8.96  | 27   | 3.8317          | 0.185    | 0.8733     | 5.9766 | 0.185    | 0.0488   | 0.2537 | 0.7681 |
| No log        | 9.96  | 30   | 3.8259          | 0.185    | 0.8724     | 6.0278 | 0.185    | 0.0488   | 0.2473 | 0.7689 |
| No log        | 10.96 | 33   | 3.8226          | 0.185    | 0.8724     | 6.8070 | 0.185    | 0.0488   | 0.2618 | 0.7671 |
| No log        | 11.96 | 36   | 3.8209          | 0.185    | 0.8730     | 7.6044 | 0.185    | 0.0488   | 0.2539 | 0.7643 |
| No log        | 12.96 | 39   | 3.8187          | 0.185    | 0.8730     | 8.1654 | 0.185    | 0.0488   | 0.2542 | 0.7612 |
| No log        | 13.96 | 42   | 3.8147          | 0.185    | 0.8725     | 7.1073 | 0.185    | 0.0488   | 0.2542 | 0.7566 |
| No log        | 14.96 | 45   | 3.8096          | 0.185    | 0.8720     | 6.3875 | 0.185    | 0.0488   | 0.2565 | 0.7566 |
| No log        | 15.96 | 48   | 3.8052          | 0.185    | 0.8712     | 6.0256 | 0.185    | 0.0488   | 0.2518 | 0.7524 |
| No log        | 16.96 | 51   | 3.8022          | 0.185    | 0.8707     | 5.7809 | 0.185    | 0.0488   | 0.2558 | 0.7485 |
| No log        | 17.96 | 54   | 3.8008          | 0.185    | 0.8701     | 5.6835 | 0.185    | 0.0488   | 0.2496 | 0.7442 |
| No log        | 18.96 | 57   | 3.7992          | 0.185    | 0.8700     | 5.3867 | 0.185    | 0.0488   | 0.2490 | 0.7421 |
| No log        | 19.96 | 60   | 3.7965          | 0.185    | 0.8694     | 5.4928 | 0.185    | 0.0488   | 0.2478 | 0.7406 |
| No log        | 20.96 | 63   | 3.7948          | 0.185    | 0.8693     | 5.5527 | 0.185    | 0.0488   | 0.2481 | 0.7405 |
| No log        | 21.96 | 66   | 3.7932          | 0.185    | 0.8691     | 5.5585 | 0.185    | 0.0488   | 0.2564 | 0.7396 |
| No log        | 22.96 | 69   | 3.7921          | 0.185    | 0.8689     | 5.5607 | 0.185    | 0.0488   | 0.2479 | 0.7391 |
| No log        | 23.96 | 72   | 3.7915          | 0.185    | 0.8688     | 5.6116 | 0.185    | 0.0488   | 0.2523 | 0.7390 |
| No log        | 24.96 | 75   | 3.7912          | 0.185    | 0.8688     | 5.6106 | 0.185    | 0.0488   | 0.2524 | 0.7391 |


### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2