---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dit-small_tobacco3482_kd_CEKD_t1.5_a0.7
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# dit-small_tobacco3482_kd_CEKD_t1.5_a0.7

This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base); the dataset was not recorded by the Trainer, but the model name suggests Tobacco3482 (document image classification).
It achieves the following results on the evaluation set:
- Loss: 2.5836
- Accuracy: 0.185
- Brier Loss: 0.8652
- NLL: 6.4546
- F1 Micro: 0.185
- F1 Macro: 0.0488
- ECE: 0.2424
- AURC: 0.7342
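
Brier loss, ECE, and AURC are calibration and selective-prediction metrics that may be unfamiliar. As an illustration (not the exact evaluation code used for this card), a minimal NumPy sketch of how the Brier score and ECE can be computed from predicted class probabilities:

```python
import numpy as np

def brier_score(probs: np.ndarray, labels: np.ndarray) -> float:
    """Multi-class Brier score: mean squared distance between the
    predicted distribution and the one-hot encoded label."""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs: np.ndarray, labels: np.ndarray,
                               n_bins: int = 10) -> float:
    """ECE: bin samples by top-1 confidence, then average the absolute gap
    between per-bin accuracy and per-bin mean confidence, weighted by bin size."""
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    ece = 0.0
    for lo in np.linspace(0.0, 1.0, n_bins, endpoint=False):
        mask = (conf > lo) & (conf <= lo + 1.0 / n_bins)
        if mask.any():
            ece += mask.mean() * abs((pred[mask] == labels[mask]).mean() - conf[mask].mean())
    return float(ece)
```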

## Model description

DiT (Document Image Transformer) is a BEiT-style vision transformer pre-trained on document images. The name of this checkpoint indicates a model fine-tuned for document image classification via knowledge distillation with a combined cross-entropy + KD objective (`CEKD`), distillation temperature 1.5, and mixing weight 0.7; no further details were recorded by the Trainer. A sketch of this loss is given under Training procedure below.

## Intended uses & limitations

More information needed. Note that the evaluation metrics above (accuracy 0.185, macro F1 0.0488, essentially unchanged from epoch 3 onward) are consistent with the model collapsing to one or very few predicted classes, so this checkpoint is unlikely to be useful without further training.
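
For completeness, a minimal inference sketch, assuming the checkpoint is available locally or on the Hub (the repo id below is hypothetical):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "dit-small_tobacco3482_kd_CEKD_t1.5_a0.7"  # hypothetical repo id / local path
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("document_scan.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```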

## Training and evaluation data

More information needed. The model name suggests the Tobacco3482 document image classification dataset (3,482 scanned document images in 10 classes), but no dataset details were recorded; a hypothetical loading sketch follows.
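
If the data follows the usual Tobacco3482 layout of one directory per class, it could be loaded with the `datasets` imagefolder builder (the local path and split ratio below are assumptions):

```python
from datasets import load_dataset

# Hypothetical local layout: tobacco3482/<class_name>/<image>.jpg
dataset = load_dataset("imagefolder", data_dir="tobacco3482")
dataset = dataset["train"].train_test_split(test_size=0.2, seed=42)
print(dataset["train"].features["label"].names)  # the 10 document classes
```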

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
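
The `CEKD_t1.5_a0.7` suffix in the model name suggests the training objective combined hard-label cross-entropy with a temperature-scaled distillation term against a teacher model (temperature 1.5, mixing weight 0.7). A sketch of the conventional formulation of such a loss; the exact weighting convention used for this checkpoint is an assumption:

```python
import torch
import torch.nn.functional as F

def cekd_loss(student_logits: torch.Tensor,
              teacher_logits: torch.Tensor,
              labels: torch.Tensor,
              temperature: float = 1.5,
              alpha: float = 0.7) -> torch.Tensor:
    """Assumed CEKD objective: alpha * KD + (1 - alpha) * CE.
    The KD term is the KL divergence between temperature-softened
    student and teacher distributions, scaled by T^2 (Hinton et al.)."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * kd + (1.0 - alpha) * ce
```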

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL    | F1 Micro | F1 Macro | ECE    | AURC   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 0.96  | 3    | 2.8093          | 0.06     | 0.9041     | 9.2868 | 0.06     | 0.0114   | 0.1752 | 0.9033 |
| No log        | 1.96  | 6    | 2.7245          | 0.18     | 0.8884     | 6.2166 | 0.18     | 0.0305   | 0.2292 | 0.8036 |
| No log        | 2.96  | 9    | 2.6443          | 0.18     | 0.8760     | 6.9627 | 0.18     | 0.0305   | 0.2437 | 0.8179 |
| No log        | 3.96  | 12   | 2.6356          | 0.185    | 0.8785     | 6.9306 | 0.185    | 0.0488   | 0.2534 | 0.7877 |
| No log        | 4.96  | 15   | 2.6338          | 0.185    | 0.8768     | 6.8870 | 0.185    | 0.0488   | 0.2605 | 0.7787 |
| No log        | 5.96  | 18   | 2.6325          | 0.185    | 0.8740     | 6.2086 | 0.185    | 0.0490   | 0.2453 | 0.7699 |
| No log        | 6.96  | 21   | 2.6322          | 0.185    | 0.8721     | 5.9554 | 0.185    | 0.0488   | 0.2474 | 0.7629 |
| No log        | 7.96  | 24   | 2.6293          | 0.185    | 0.8712     | 5.9359 | 0.185    | 0.0488   | 0.2550 | 0.7576 |
| No log        | 8.96  | 27   | 2.6221          | 0.185    | 0.8701     | 5.9468 | 0.185    | 0.0488   | 0.2436 | 0.7536 |
| No log        | 9.96  | 30   | 2.6171          | 0.185    | 0.8697     | 6.6875 | 0.185    | 0.0488   | 0.2497 | 0.7541 |
| No log        | 10.96 | 33   | 2.6126          | 0.185    | 0.8697     | 6.7549 | 0.185    | 0.0488   | 0.2512 | 0.7517 |
| No log        | 11.96 | 36   | 2.6084          | 0.185    | 0.8697     | 6.7827 | 0.185    | 0.0488   | 0.2476 | 0.7489 |
| No log        | 12.96 | 39   | 2.6037          | 0.185    | 0.8692     | 6.7652 | 0.185    | 0.0488   | 0.2557 | 0.7476 |
| No log        | 13.96 | 42   | 2.5986          | 0.185    | 0.8683     | 6.6847 | 0.185    | 0.0488   | 0.2513 | 0.7446 |
| No log        | 14.96 | 45   | 2.5940          | 0.185    | 0.8676     | 6.6600 | 0.185    | 0.0488   | 0.2572 | 0.7447 |
| No log        | 15.96 | 48   | 2.5910          | 0.185    | 0.8669     | 6.6410 | 0.185    | 0.0488   | 0.2448 | 0.7424 |
| No log        | 16.96 | 51   | 2.5897          | 0.185    | 0.8667     | 6.6371 | 0.185    | 0.0488   | 0.2402 | 0.7402 |
| No log        | 17.96 | 54   | 2.5898          | 0.185    | 0.8664     | 6.5096 | 0.185    | 0.0488   | 0.2549 | 0.7371 |
| No log        | 18.96 | 57   | 2.5897          | 0.185    | 0.8664     | 6.5160 | 0.185    | 0.0488   | 0.2504 | 0.7363 |
| No log        | 19.96 | 60   | 2.5877          | 0.185    | 0.8660     | 6.4661 | 0.185    | 0.0488   | 0.2416 | 0.7346 |
| No log        | 20.96 | 63   | 2.5865          | 0.185    | 0.8658     | 6.4833 | 0.185    | 0.0488   | 0.2459 | 0.7347 |
| No log        | 21.96 | 66   | 2.5852          | 0.185    | 0.8655     | 6.4690 | 0.185    | 0.0488   | 0.2460 | 0.7343 |
| No log        | 22.96 | 69   | 2.5843          | 0.185    | 0.8654     | 6.4625 | 0.185    | 0.0488   | 0.2461 | 0.7340 |
| No log        | 23.96 | 72   | 2.5838          | 0.185    | 0.8653     | 6.4568 | 0.185    | 0.0488   | 0.2424 | 0.7342 |
| No log        | 24.96 | 75   | 2.5836          | 0.185    | 0.8652     | 6.4546 | 0.185    | 0.0488   | 0.2424 | 0.7342 |


### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2