---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dit-small_tobacco3482_kd_CEKD_t1.5_a0.5
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# dit-small_tobacco3482_kd_CEKD_t1.5_a0.5

This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on an unspecified dataset (presumably Tobacco3482, given the model name).
It achieves the following results on the evaluation set:
- Loss: 2.8753
- Accuracy: 0.185
- Brier Loss: 0.8660
- Nll: 6.5533
- F1 Micro: 0.185
- F1 Macro: 0.0488
- Ece: 0.2451
- Aurc: 0.7363

## Model description

More information needed
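
A minimal usage sketch for document image classification is shown below. The repo id, input image path, and the use of `AutoImageProcessor`/`AutoModelForImageClassification` are assumptions; substitute the actual Hub path of this checkpoint.

```python
# Sketch only: "dit-small_tobacco3482_kd_CEKD_t1.5_a0.5" is a placeholder repo id
# and "document.png" is a placeholder input image.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "dit-small_tobacco3482_kd_CEKD_t1.5_a0.5"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```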

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure
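
The `CEKD_t1.5_a0.5` suffix in the model name suggests the student was trained with a combined cross-entropy and knowledge-distillation objective using temperature 1.5 and mixing weight 0.5. The exact loss used here is not documented; a minimal sketch of such an objective, under that assumption, is:

```python
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, temperature=1.5, alpha=0.5):
    """Assumed objective: alpha * CE(student, labels)
    + (1 - alpha) * T^2 * KL(teacher_soft || student_soft)."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * ce + (1.0 - alpha) * kd
```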

### Training hyperparameters

The following hyperparameters were used during training (an approximate `TrainingArguments` equivalent is sketched after this list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
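
These settings map roughly onto the following `TrainingArguments` (a sketch, not the exact training script; the `output_dir` and `evaluation_strategy` values are assumptions):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dit-small_tobacco3482_kd_CEKD_t1.5_a0.5",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,  # 16 * 16 = effective train batch size of 256
    num_train_epochs=25,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",  # assumption: the metrics below are reported once per epoch
)
```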

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll    | F1 Micro | F1 Macro | Ece    | Aurc   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 0.96  | 3    | 3.1378          | 0.06     | 0.9042     | 9.2898 | 0.06     | 0.0114   | 0.1754 | 0.9032 |
| No log        | 1.96  | 6    | 3.0447          | 0.18     | 0.8884     | 6.2145 | 0.18     | 0.0305   | 0.2294 | 0.8048 |
| No log        | 2.96  | 9    | 2.9500          | 0.18     | 0.8761     | 6.9445 | 0.18     | 0.0305   | 0.2447 | 0.8193 |
| No log        | 3.96  | 12   | 2.9328          | 0.18     | 0.8800     | 6.9512 | 0.18     | 0.0305   | 0.2565 | 0.8122 |
| No log        | 4.96  | 15   | 2.9305          | 0.185    | 0.8793     | 6.9136 | 0.185    | 0.0488   | 0.2557 | 0.7823 |
| No log        | 5.96  | 18   | 2.9286          | 0.185    | 0.8762     | 6.7762 | 0.185    | 0.0488   | 0.2533 | 0.7721 |
| No log        | 6.96  | 21   | 2.9265          | 0.185    | 0.8731     | 5.9902 | 0.185    | 0.0488   | 0.2345 | 0.7682 |
| No log        | 7.96  | 24   | 2.9240          | 0.185    | 0.8718     | 5.9696 | 0.185    | 0.0488   | 0.2625 | 0.7621 |
| No log        | 8.96  | 27   | 2.9177          | 0.185    | 0.8707     | 5.9711 | 0.185    | 0.0488   | 0.2463 | 0.7578 |
| No log        | 9.96  | 30   | 2.9129          | 0.185    | 0.8702     | 6.6932 | 0.185    | 0.0488   | 0.2485 | 0.7574 |
| No log        | 10.96 | 33   | 2.9082          | 0.185    | 0.8704     | 6.7772 | 0.185    | 0.0488   | 0.2500 | 0.7560 |
| No log        | 11.96 | 36   | 2.9039          | 0.185    | 0.8707     | 6.8060 | 0.185    | 0.0488   | 0.2464 | 0.7537 |
| No log        | 12.96 | 39   | 2.8990          | 0.185    | 0.8704     | 6.7988 | 0.185    | 0.0488   | 0.2466 | 0.7515 |
| No log        | 13.96 | 42   | 2.8933          | 0.185    | 0.8696     | 6.7771 | 0.185    | 0.0488   | 0.2505 | 0.7479 |
| No log        | 14.96 | 45   | 2.8879          | 0.185    | 0.8688     | 6.7597 | 0.185    | 0.0488   | 0.2523 | 0.7482 |
| No log        | 15.96 | 48   | 2.8840          | 0.185    | 0.8679     | 6.6825 | 0.185    | 0.0488   | 0.2648 | 0.7454 |
| No log        | 16.96 | 51   | 2.8822          | 0.185    | 0.8676     | 6.6742 | 0.185    | 0.0488   | 0.2473 | 0.7425 |
| No log        | 17.96 | 54   | 2.8819          | 0.185    | 0.8672     | 6.5521 | 0.185    | 0.0488   | 0.2479 | 0.7405 |
| No log        | 18.96 | 57   | 2.8817          | 0.185    | 0.8671     | 6.5498 | 0.185    | 0.0488   | 0.2536 | 0.7385 |
| No log        | 19.96 | 60   | 2.8797          | 0.185    | 0.8667     | 6.5563 | 0.185    | 0.0488   | 0.2442 | 0.7371 |
| No log        | 20.96 | 63   | 2.8784          | 0.185    | 0.8666     | 6.6145 | 0.185    | 0.0488   | 0.2528 | 0.7374 |
| No log        | 21.96 | 66   | 2.8770          | 0.185    | 0.8663     | 6.6084 | 0.185    | 0.0488   | 0.2489 | 0.7366 |
| No log        | 22.96 | 69   | 2.8760          | 0.185    | 0.8662     | 6.5683 | 0.185    | 0.0488   | 0.2448 | 0.7360 |
| No log        | 23.96 | 72   | 2.8756          | 0.185    | 0.8661     | 6.5544 | 0.185    | 0.0488   | 0.2450 | 0.7363 |
| No log        | 24.96 | 75   | 2.8753          | 0.185    | 0.8660     | 6.5533 | 0.185    | 0.0488   | 0.2451 | 0.7363 |


### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2