---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dit-tiny_tobacco3482_kd_CEKD_t1.5_a0.9
  results: []
---


# dit-tiny_tobacco3482_kd_CEKD_t1.5_a0.9

This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco3482 dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3286
- Accuracy: 0.18
- Brier Loss: 0.8742
- NLL: 6.7213
- F1 Micro: 0.18
- F1 Macro: 0.0306
- ECE: 0.2558
- AURC: 0.8491
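
Brier loss, NLL, ECE, and AURC are calibration-oriented metrics rather than standard `Trainer` outputs, and the exact implementations used for this card are not documented. As a reference only, here is a sketch of how the multi-class Brier score and ECE are commonly computed from predicted probabilities:

```python
import numpy as np

def brier_score(probs, labels):
    """Mean squared error between predicted probabilities and one-hot labels."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=10):
    """Weighted gap between mean confidence and accuracy over confidence bins."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    ece = 0.0
    for lo in np.linspace(0.0, 1.0, n_bins, endpoint=False):
        mask = (conf > lo) & (conf <= lo + 1.0 / n_bins)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return ece
```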

## Model description

Not documented in the original card. Going by the model name, `dit-tiny_tobacco3482_kd_CEKD_t1.5_a0.9` appears to be a compact DiT student trained with knowledge distillation: the `CEKD` tag suggests a combined cross-entropy + distillation objective with temperature `t=1.5` and mixing weight `a=0.9`, targeting document image classification on Tobacco3482.
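
For readers decoding the naming convention, a minimal sketch of what a `CEKD` objective with `t=1.5` and `a=0.9` typically looks like follows. Whether `a` weights the distillation term or the cross-entropy term is an assumption; the card does not say.

```python
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, T=1.5, alpha=0.9):
    """Sketch of a combined CE + KD loss matching the name `CEKD_t1.5_a0.9`."""
    # Hard-label cross-entropy on the student's raw logits.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-target term: KL divergence against the teacher at temperature T,
    # scaled by T**2 to keep gradient magnitudes comparable across temperatures.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T ** 2)
    # Assumption: alpha weights the distillation term (a common convention).
    return alpha * kd + (1.0 - alpha) * ce
```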

## Intended uses & limitations

Presumably intended for document image classification in the style of Tobacco3482. Note the evaluation results above: 18% accuracy with an F1 macro of 0.0306 indicates the model predicts only a small subset of classes, so it should be treated as a training-recipe artifact rather than a usable classifier.
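
If you do want to run it, a minimal inference sketch with the Transformers auto classes follows; the checkpoint path is a placeholder for wherever this model is actually stored:

```python
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder path; point this at the actual checkpoint location.
ckpt = "./dit-tiny_tobacco3482_kd_CEKD_t1.5_a0.9"

processor = AutoImageProcessor.from_pretrained(ckpt)
model = AutoModelForImageClassification.from_pretrained(ckpt)

image = Image.open("document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```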

## Training and evaluation data

Not documented in the original card. The model name points to Tobacco3482, a 10-class benchmark of 3,482 scanned document images. The exact train/evaluation split is not recorded here, though the evaluation metrics move in steps of 0.005, consistent with an evaluation set of around 200 examples.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
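
These values map directly onto `transformers.TrainingArguments`; a sketch assuming single-GPU training, so that the per-device batch of 16 with 16 accumulation steps yields the reported effective batch of 256 (the output directory is arbitrary):

```python
from transformers import TrainingArguments

# Sketch reconstructing the reported hyperparameters.
training_args = TrainingArguments(
    output_dir="dit-tiny_tobacco3482_kd_CEKD_t1.5_a0.9",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=16,  # 16 * 16 = effective batch of 256
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=25,
)
```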

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL     | F1 Micro | F1 Macro | ECE    | AURC   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log        | 0.96  | 3    | 2.4683          | 0.145    | 0.8999     | 10.1538 | 0.145    | 0.0253   | 0.2220 | 0.8466 |
| No log        | 1.96  | 6    | 2.4396          | 0.145    | 0.8947     | 10.5704 | 0.145    | 0.0253   | 0.2237 | 0.8463 |
| No log        | 2.96  | 9    | 2.3985          | 0.145    | 0.8869     | 8.5511  | 0.145    | 0.0451   | 0.2116 | 0.8036 |
| No log        | 3.96  | 12   | 2.3677          | 0.21     | 0.8810     | 6.5446  | 0.2100   | 0.0611   | 0.2566 | 0.8335 |
| No log        | 4.96  | 15   | 2.3517          | 0.155    | 0.8780     | 6.8400  | 0.155    | 0.0279   | 0.2309 | 0.8894 |
| No log        | 5.96  | 18   | 2.3450          | 0.18     | 0.8771     | 8.1897  | 0.18     | 0.0313   | 0.2495 | 0.8531 |
| No log        | 6.96  | 21   | 2.3407          | 0.18     | 0.8767     | 7.3073  | 0.18     | 0.0306   | 0.2551 | 0.8513 |
| No log        | 7.96  | 24   | 2.3371          | 0.18     | 0.8763     | 6.9328  | 0.18     | 0.0306   | 0.2501 | 0.8520 |
| No log        | 8.96  | 27   | 2.3337          | 0.18     | 0.8757     | 6.8828  | 0.18     | 0.0306   | 0.2507 | 0.8525 |
| No log        | 9.96  | 30   | 2.3321          | 0.18     | 0.8753     | 6.8682  | 0.18     | 0.0306   | 0.2508 | 0.8524 |
| No log        | 10.96 | 33   | 2.3312          | 0.18     | 0.8751     | 6.7981  | 0.18     | 0.0306   | 0.2462 | 0.8521 |
| No log        | 11.96 | 36   | 2.3309          | 0.18     | 0.8749     | 6.7375  | 0.18     | 0.0306   | 0.2531 | 0.8520 |
| No log        | 12.96 | 39   | 2.3307          | 0.18     | 0.8748     | 6.7235  | 0.18     | 0.0306   | 0.2524 | 0.8518 |
| No log        | 13.96 | 42   | 2.3304          | 0.18     | 0.8747     | 6.7200  | 0.18     | 0.0306   | 0.2482 | 0.8514 |
| No log        | 14.96 | 45   | 2.3301          | 0.18     | 0.8746     | 6.7201  | 0.18     | 0.0306   | 0.2410 | 0.8509 |
| No log        | 15.96 | 48   | 2.3298          | 0.18     | 0.8746     | 6.7182  | 0.18     | 0.0306   | 0.2449 | 0.8505 |
| No log        | 16.96 | 51   | 2.3295          | 0.18     | 0.8745     | 6.7211  | 0.18     | 0.0306   | 0.2412 | 0.8500 |
| No log        | 17.96 | 54   | 2.3297          | 0.18     | 0.8745     | 6.7201  | 0.18     | 0.0306   | 0.2449 | 0.8496 |
| No log        | 18.96 | 57   | 2.3296          | 0.18     | 0.8745     | 6.7216  | 0.18     | 0.0306   | 0.2392 | 0.8494 |
| No log        | 19.96 | 60   | 2.3292          | 0.18     | 0.8744     | 6.7214  | 0.18     | 0.0306   | 0.2371 | 0.8494 |
| No log        | 20.96 | 63   | 2.3290          | 0.18     | 0.8744     | 6.7222  | 0.18     | 0.0306   | 0.2371 | 0.8493 |
| No log        | 21.96 | 66   | 2.3288          | 0.18     | 0.8743     | 6.7227  | 0.18     | 0.0306   | 0.2408 | 0.8494 |
| No log        | 22.96 | 69   | 2.3286          | 0.18     | 0.8743     | 6.7223  | 0.18     | 0.0306   | 0.2558 | 0.8490 |
| No log        | 23.96 | 72   | 2.3286          | 0.18     | 0.8743     | 6.7218  | 0.18     | 0.0306   | 0.2558 | 0.8491 |
| No log        | 24.96 | 75   | 2.3286          | 0.18     | 0.8742     | 6.7213  | 0.18     | 0.0306   | 0.2558 | 0.8491 |


### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2