---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dit-tiny_tobacco3482_kd_CEKD_t2.5_a0.9
  results: []
---

# dit-tiny_tobacco3482_kd_CEKD_t2.5_a0.9

This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on the Tobacco3482 dataset (per the model name).
It achieves the following results on the evaluation set:
- Loss: 2.5379
- Accuracy: 0.18
- Brier Loss: 0.8746
- NLL: 6.7389
- F1 Micro: 0.18
- F1 Macro: 0.0306
- ECE: 0.2460
- AURC: 0.8496

## Model description

More information needed. The name encodes a knowledge-distillation setup: a DiT student trained with a combined cross-entropy/distillation (CEKD) objective at temperature t = 2.5 and distillation weight a = 0.9.

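A minimal sketch of such a CE+KD objective, assuming standard softened-softmax (Hinton-style) distillation; the function below illustrates the technique named in the card's title and is not the actual training code:

```python
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, temperature=2.5, alpha=0.9):
    # Hypothetical reconstruction of the CEKD objective from the model name.
    # Hard-label cross-entropy on the student's raw logits.
    ce = F.cross_entropy(student_logits, labels)
    # KL divergence between temperature-softened student and teacher
    # distributions; the T**2 factor keeps gradients on a comparable
    # scale across temperatures.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature**2
    # alpha weights the distillation term against the hard-label term.
    return alpha * kd + (1.0 - alpha) * ce
```
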
## Intended uses & limitations

More information needed

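No usage notes were provided. As a sketch of the intended use (document image classification), the checkpoint should load with the standard `transformers` auto classes; the repository id and image path below are assumptions:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "jordyvl/dit-tiny_tobacco3482_kd_CEKD_t2.5_a0.9"  # assumed repo id

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("document.png").convert("RGB")  # assumed input file
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

print(model.config.id2label[logits.argmax(-1).item()])
```

Given the reported 0.18 accuracy and low macro F1 (0.0306), this checkpoint appears under-trained and is likely useful only as a distillation baseline.
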
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (reproduced as a `TrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 256 (16 × 16 via gradient accumulation)
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25

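A minimal `TrainingArguments` sketch matching the listed values, assuming the standard `transformers` `Trainer`; `output_dir` is an assumption, and the Adam betas/epsilon above are the library defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dit-tiny_tobacco3482_kd_CEKD_t2.5_a0.9",  # assumed
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    # 16 per-device x 16 accumulation steps = effective batch size of 256.
    gradient_accumulation_steps=16,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,  # lr_scheduler_warmup_ratio in the list above
    num_train_epochs=25,
)
```
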
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
| No log | 0.96 | 3 | 2.6891 | 0.145 | 0.8999 | 10.1550 | 0.145 | 0.0253 | 0.2220 | 0.8466 |
| No log | 1.96 | 6 | 2.6592 | 0.145 | 0.8947 | 10.5706 | 0.145 | 0.0253 | 0.2238 | 0.8463 |
| No log | 2.96 | 9 | 2.6158 | 0.14 | 0.8869 | 8.5528 | 0.14 | 0.0422 | 0.2066 | 0.8175 |
| No log | 3.96 | 12 | 2.5827 | 0.175 | 0.8810 | 6.5464 | 0.175 | 0.0467 | 0.2385 | 0.8661 |
| No log | 4.96 | 15 | 2.5647 | 0.155 | 0.8781 | 6.8570 | 0.155 | 0.0274 | 0.2316 | 0.8886 |
| No log | 5.96 | 18 | 2.5566 | 0.19 | 0.8772 | 8.4283 | 0.19 | 0.0413 | 0.2460 | 0.8532 |
| No log | 6.96 | 21 | 2.5515 | 0.18 | 0.8769 | 7.6865 | 0.18 | 0.0308 | 0.2480 | 0.8517 |
| No log | 7.96 | 24 | 2.5475 | 0.18 | 0.8767 | 6.9727 | 0.18 | 0.0306 | 0.2469 | 0.8521 |
| No log | 8.96 | 27 | 2.5438 | 0.18 | 0.8762 | 6.9080 | 0.18 | 0.0306 | 0.2438 | 0.8525 |
| No log | 9.96 | 30 | 2.5420 | 0.18 | 0.8758 | 6.8906 | 0.18 | 0.0306 | 0.2521 | 0.8528 |
| No log | 10.96 | 33 | 2.5410 | 0.18 | 0.8755 | 6.8317 | 0.18 | 0.0306 | 0.2516 | 0.8524 |
| No log | 11.96 | 36 | 2.5404 | 0.18 | 0.8753 | 6.7606 | 0.18 | 0.0306 | 0.2469 | 0.8516 |
| No log | 12.96 | 39 | 2.5401 | 0.18 | 0.8752 | 6.7444 | 0.18 | 0.0306 | 0.2425 | 0.8516 |
| No log | 13.96 | 42 | 2.5397 | 0.18 | 0.8751 | 6.7397 | 0.18 | 0.0306 | 0.2498 | 0.8514 |
| No log | 14.96 | 45 | 2.5393 | 0.18 | 0.8750 | 6.7390 | 0.18 | 0.0306 | 0.2579 | 0.8511 |
| No log | 15.96 | 48 | 2.5389 | 0.18 | 0.8749 | 6.7366 | 0.18 | 0.0306 | 0.2463 | 0.8513 |
| No log | 16.96 | 51 | 2.5387 | 0.18 | 0.8749 | 6.7390 | 0.18 | 0.0306 | 0.2465 | 0.8510 |
| No log | 17.96 | 54 | 2.5389 | 0.18 | 0.8749 | 6.7382 | 0.18 | 0.0306 | 0.2425 | 0.8505 |
| No log | 18.96 | 57 | 2.5389 | 0.18 | 0.8749 | 6.7397 | 0.18 | 0.0306 | 0.2463 | 0.8504 |
| No log | 19.96 | 60 | 2.5384 | 0.18 | 0.8748 | 6.7391 | 0.18 | 0.0306 | 0.2421 | 0.8495 |
| No log | 20.96 | 63 | 2.5383 | 0.18 | 0.8747 | 6.7396 | 0.18 | 0.0306 | 0.2422 | 0.8500 |
| No log | 21.96 | 66 | 2.5380 | 0.18 | 0.8747 | 6.7399 | 0.18 | 0.0306 | 0.2460 | 0.8496 |
| No log | 22.96 | 69 | 2.5379 | 0.18 | 0.8746 | 6.7395 | 0.18 | 0.0306 | 0.2460 | 0.8497 |
| No log | 23.96 | 72 | 2.5379 | 0.18 | 0.8746 | 6.7393 | 0.18 | 0.0306 | 0.2460 | 0.8497 |
| No log | 24.96 | 75 | 2.5379 | 0.18 | 0.8746 | 6.7389 | 0.18 | 0.0306 | 0.2460 | 0.8496 |

### Framework versions

- Transformers 4.26.1
- PyTorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2