---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dit-small_tobacco3482_simkd_CEKD_t1_aNone
  results: []
---


# dit-small_tobacco3482_simkd_CEKD_t1_aNone

This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco3482 dataset (recorded as `None` by the training script).
It achieves the following results on the evaluation set:
- Loss: 0.9876
- Accuracy: 0.085
- Brier Loss: 0.8927
- NLL: 8.3272
- F1 Micro: 0.085
- F1 Macro: 0.0461
- ECE: 0.1645
- AURC: 0.7988

## Model description

Judging by the model name, this appears to be a DiT (Document Image Transformer) student fine-tuned for document image classification and trained with a knowledge-distillation objective (`simkd`, CE+KD, temperature 1, alpha `None`). The training script did not fill in this section, so these details are inferred from the name rather than documented. A generic CE+KD loss sketch follows.
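
For reference, here is a minimal sketch (an assumption, not the published training code) of a combined cross-entropy + KL-divergence distillation objective of the kind the `CEKD_t1_aNone` tag suggests; the `alpha` weighting and the teacher logits are placeholders.

```python
# Hedged sketch of a CE + KD objective; `alpha` and `temperature` are
# placeholders inferred from the model name (t1 -> temperature 1).
import torch
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, temperature=1.0, alpha=0.5):
    # Supervised cross-entropy on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    # KL divergence between softened student and teacher distributions.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature**2
    return alpha * ce + (1.0 - alpha) * kd
```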

## Intended uses & limitations

More information needed. Note that the final evaluation accuracy (8.5%) is at or below the ~10% chance level for a 10-class task such as Tobacco3482, so this checkpoint is more useful for analyzing the distillation setup than for practical document classification. A minimal loading sketch follows.
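
A minimal inference sketch; the checkpoint path below is hypothetical, so substitute your own local directory or Hub repo id. DiT checkpoints load through the standard image-classification Auto classes:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical checkpoint location; replace with the actual path or repo id.
checkpoint = "dit-small_tobacco3482_simkd_CEKD_t1_aNone"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)
model.eval()

image = Image.open("document.png").convert("RGB")  # a scanned document page
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```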

## Training and evaluation data

More information needed. The model name suggests the Tobacco3482 document image classification benchmark; the auto-generated card does not record the dataset or its splits.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
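
These settings map onto `transformers.TrainingArguments` roughly as follows; this is a sketch assuming a standard `Trainer` setup, since the distillation-specific trainer and data pipeline are not published:

```python
# Sketch of the listed hyperparameters as transformers.TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dit-small_tobacco3482_simkd_CEKD_t1_aNone",
    learning_rate=2e-5,
    per_device_train_batch_size=4,   # train_batch_size: 4
    per_device_eval_batch_size=4,    # eval_batch_size: 4
    seed=42,
    gradient_accumulation_steps=16,  # 4 * 16 = 64 total train batch size
    adam_beta1=0.9,                  # Trainer's default AdamW optimizer with
    adam_beta2=0.999,                # the listed betas and epsilon
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=25,
)
```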

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL    | F1 Micro | F1 Macro | ECE    | AURC   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 0.96  | 12   | 1.0049          | 0.08     | 0.8993     | 5.4663 | 0.08     | 0.0322   | 0.1476 | 0.8883 |
| No log        | 1.96  | 24   | 1.0007          | 0.165    | 0.8988     | 5.5926 | 0.165    | 0.0284   | 0.2066 | 0.8251 |
| No log        | 2.96  | 36   | 0.9994          | 0.16     | 0.8982     | 5.9135 | 0.16     | 0.0277   | 0.2100 | 0.8518 |
| No log        | 3.96  | 48   | 0.9984          | 0.17     | 0.8975     | 6.1195 | 0.17     | 0.0574   | 0.2142 | 0.8153 |
| No log        | 4.96  | 60   | 0.9976          | 0.19     | 0.8970     | 6.2724 | 0.19     | 0.0752   | 0.2294 | 0.8254 |
| No log        | 5.96  | 72   | 0.9967          | 0.09     | 0.8968     | 6.3787 | 0.09     | 0.0315   | 0.1591 | 0.7950 |
| No log        | 6.96  | 84   | 0.9958          | 0.065    | 0.8964     | 6.4218 | 0.065    | 0.0122   | 0.1433 | 0.8333 |
| No log        | 7.96  | 96   | 0.9949          | 0.065    | 0.8960     | 6.5170 | 0.065    | 0.0122   | 0.1543 | 0.8344 |
| No log        | 8.96  | 108  | 0.9941          | 0.065    | 0.8956     | 6.5572 | 0.065    | 0.0123   | 0.1545 | 0.8331 |
| No log        | 9.96  | 120  | 0.9934          | 0.07     | 0.8954     | 6.6362 | 0.07     | 0.0304   | 0.1597 | 0.8313 |
| No log        | 10.96 | 132  | 0.9926          | 0.07     | 0.8951     | 6.6430 | 0.07     | 0.0304   | 0.1576 | 0.8325 |
| No log        | 11.96 | 144  | 0.9920          | 0.07     | 0.8948     | 6.6842 | 0.07     | 0.0304   | 0.1590 | 0.8225 |
| No log        | 12.96 | 156  | 0.9914          | 0.07     | 0.8947     | 6.7731 | 0.07     | 0.0304   | 0.1619 | 0.8155 |
| No log        | 13.96 | 168  | 0.9909          | 0.07     | 0.8944     | 6.8584 | 0.07     | 0.0304   | 0.1522 | 0.8128 |
| No log        | 14.96 | 180  | 0.9904          | 0.07     | 0.8941     | 6.8161 | 0.07     | 0.0304   | 0.1524 | 0.8142 |
| No log        | 15.96 | 192  | 0.9899          | 0.07     | 0.8940     | 7.3169 | 0.07     | 0.0304   | 0.1532 | 0.8109 |
| No log        | 16.96 | 204  | 0.9894          | 0.07     | 0.8937     | 7.8481 | 0.07     | 0.0304   | 0.1531 | 0.8132 |
| No log        | 17.96 | 216  | 0.9890          | 0.08     | 0.8935     | 8.3375 | 0.08     | 0.0439   | 0.1587 | 0.8002 |
| No log        | 18.96 | 228  | 0.9886          | 0.07     | 0.8933     | 8.4250 | 0.07     | 0.0307   | 0.1536 | 0.8132 |
| No log        | 19.96 | 240  | 0.9883          | 0.085    | 0.8931     | 8.4316 | 0.085    | 0.0445   | 0.1618 | 0.8014 |
| No log        | 20.96 | 252  | 0.9880          | 0.075    | 0.8930     | 8.4395 | 0.075    | 0.0392   | 0.1566 | 0.8088 |
| No log        | 21.96 | 264  | 0.9878          | 0.085    | 0.8929     | 8.3319 | 0.085    | 0.0476   | 0.1621 | 0.7956 |
| No log        | 22.96 | 276  | 0.9877          | 0.08     | 0.8928     | 8.3274 | 0.08     | 0.0439   | 0.1594 | 0.8024 |
| No log        | 23.96 | 288  | 0.9876          | 0.08     | 0.8927     | 8.3285 | 0.08     | 0.0440   | 0.1595 | 0.8014 |
| No log        | 24.96 | 300  | 0.9876          | 0.085    | 0.8927     | 8.3272 | 0.085    | 0.0461   | 0.1645 | 0.7988 |
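
The Brier Loss and ECE columns are not standard `evaluate` metrics. One plausible reading (an assumption, since the evaluation code is not published) is the multi-class Brier score and a 10-bin expected calibration error computed from softmax probabilities:

```python
# Hedged sketch of the calibration metrics, assuming `probs` is an (N, C) array
# of softmax probabilities and `labels` an (N,) array of integer class ids.
import numpy as np

def brier_loss(probs, labels):
    # Multi-class Brier score: mean squared distance to the one-hot target.
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs, labels, n_bins=10):
    # Confidence-binned |accuracy - confidence| gap, weighted by bin size.
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    ece = 0.0
    for lo in np.linspace(0.0, 1.0, n_bins, endpoint=False):
        mask = (conf > lo) & (conf <= lo + 1.0 / n_bins)
        if mask.any():
            acc = float((pred[mask] == labels[mask]).mean())
            ece += mask.mean() * abs(acc - float(conf[mask].mean()))
    return float(ece)
```

The reported Brier values near 0.9 are consistent with this summed multi-class definition when the model outputs near-uniform probabilities over 10 classes.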


### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2