jordyvl committed
Commit 002f023
1 Parent(s): b0a5d1a

update model card README.md

Files changed (1)
  1. README.md +91 -0
README.md ADDED
---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dit-tiny_rvl_cdip_100_examples_per_class_simkd_CEKD_t1_aNone
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# dit-tiny_rvl_cdip_100_examples_per_class_simkd_CEKD_t1_aNone
This model is a fine-tuned version of [microsoft/dit-base](https://huggingface.co/microsoft/dit-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1502
- Accuracy: 0.0625
- Brier Loss: 0.9374
- NLL: 9.1398
- F1 Micro: 0.0625
- F1 Macro: 0.0074
- ECE: 0.1015
- AURC: 0.9383
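
Brier loss, NLL, ECE, and AURC are calibration and failure-prediction metrics rather than stock `Trainer` outputs, so they presumably come from a custom `compute_metrics`. Note that 0.0625 is exactly 1/16, i.e. chance-level accuracy if the dataset is the 16-class RVL-CDIP that the model name suggests. As a rough sketch only (not the evaluation code used here; its exact reductions, e.g. for the NLL of ~9.14, may differ), the first three can be computed from softmax probabilities:

```python
import numpy as np

def calibration_metrics(probs, labels, n_bins=10):
    """Illustrative Brier score, NLL, and ECE from softmax outputs."""
    probs = np.asarray(probs)          # (N, C) predicted class probabilities
    labels = np.asarray(labels)        # (N,) integer class labels
    n, c = probs.shape
    onehot = np.eye(c)[labels]
    # Multi-class Brier score: squared distance between the predicted
    # probability vector and the one-hot target, averaged over examples.
    brier = np.mean(np.sum((probs - onehot) ** 2, axis=1))
    # Negative log-likelihood of the true class.
    nll = -np.mean(np.log(probs[np.arange(n), labels] + 1e-12))
    # Expected Calibration Error: bucket predictions by confidence and
    # average |accuracy - confidence| over buckets, weighted by bucket size.
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            acc = (pred[in_bin] == labels[in_bin]).mean()
            ece += in_bin.mean() * abs(acc - conf[in_bin].mean())
    return brier, nll, ece
```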
## Model description

More information needed

## Intended uses & limitations

More information needed
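
Pending a fuller description, here is a minimal inference sketch. The repo id below is an assumption pieced together from the committer and model name, and the input path is a placeholder:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# NOTE: repo id is a guess assembled from the committer and model name.
repo_id = "jordyvl/dit-tiny_rvl_cdip_100_examples_per_class_simkd_CEKD_t1_aNone"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("document_page.png").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[int(logits.argmax(-1))])
```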
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
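
The effective batch size of 64 is the per-device batch of 4 multiplied by 16 gradient-accumulation steps. As a sketch of how these settings map onto `transformers` `TrainingArguments` (single device assumed; the Adam betas and epsilon above are the library defaults, and the distillation-specific parts of the actual training script are not shown):

```python
from transformers import TrainingArguments

# Sketch only: mirrors the card's hyperparameters; output_dir is a placeholder.
args = TrainingArguments(
    output_dir="dit-tiny_rvl_cdip_simkd",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=4,   # x 16 accumulation steps = 64 effective
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=16,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=25,
)
```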
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL    | F1 Micro | F1 Macro | ECE    | AURC   |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 0.96  | 12   | 0.1540          | 0.0625   | 0.9376     | 8.5438 | 0.0625   | 0.0074   | 0.1043 | 0.9530 |
| No log        | 1.96  | 24   | 0.1519          | 0.0625   | 0.9376     | 8.2831 | 0.0625   | 0.0074   | 0.1008 | 0.9465 |
| No log        | 2.96  | 36   | 0.1512          | 0.0625   | 0.9375     | 8.4629 | 0.0625   | 0.0074   | 0.1028 | 0.9336 |
| No log        | 3.96  | 48   | 0.1510          | 0.0625   | 0.9375     | 8.6283 | 0.0625   | 0.0074   | 0.1027 | 0.9365 |
| No log        | 4.96  | 60   | 0.1509          | 0.0625   | 0.9375     | 8.5065 | 0.0625   | 0.0074   | 0.1030 | 0.9433 |
| No log        | 5.96  | 72   | 0.1508          | 0.0625   | 0.9375     | 8.4779 | 0.0625   | 0.0074   | 0.1017 | 0.9414 |
| No log        | 6.96  | 84   | 0.1507          | 0.0625   | 0.9375     | 8.5053 | 0.0625   | 0.0074   | 0.1045 | 0.9438 |
| No log        | 7.96  | 96   | 0.1507          | 0.0625   | 0.9375     | 8.7396 | 0.0625   | 0.0074   | 0.1032 | 0.9440 |
| No log        | 8.96  | 108  | 0.1506          | 0.0625   | 0.9375     | 8.6420 | 0.0625   | 0.0074   | 0.1031 | 0.9448 |
| No log        | 9.96  | 120  | 0.1506          | 0.0625   | 0.9375     | 8.8410 | 0.0625   | 0.0074   | 0.1045 | 0.9438 |
| No log        | 10.96 | 132  | 0.1506          | 0.0625   | 0.9374     | 8.9438 | 0.0625   | 0.0074   | 0.1042 | 0.9413 |
| No log        | 11.96 | 144  | 0.1505          | 0.0625   | 0.9374     | 8.9847 | 0.0625   | 0.0074   | 0.1032 | 0.9418 |
| No log        | 12.96 | 156  | 0.1505          | 0.0625   | 0.9374     | 9.0594 | 0.0625   | 0.0074   | 0.1031 | 0.9397 |
| No log        | 13.96 | 168  | 0.1504          | 0.0625   | 0.9374     | 9.0748 | 0.0625   | 0.0074   | 0.1045 | 0.9343 |
| No log        | 14.96 | 180  | 0.1504          | 0.0625   | 0.9374     | 9.0912 | 0.0625   | 0.0074   | 0.1018 | 0.9358 |
| No log        | 15.96 | 192  | 0.1504          | 0.0625   | 0.9374     | 9.0950 | 0.0625   | 0.0074   | 0.1032 | 0.9331 |
| No log        | 16.96 | 204  | 0.1503          | 0.0625   | 0.9374     | 9.2141 | 0.0625   | 0.0074   | 0.1015 | 0.9363 |
| No log        | 17.96 | 216  | 0.1503          | 0.0625   | 0.9374     | 9.0918 | 0.0625   | 0.0074   | 0.1046 | 0.9354 |
| No log        | 18.96 | 228  | 0.1503          | 0.0625   | 0.9374     | 9.1430 | 0.0625   | 0.0074   | 0.1018 | 0.9385 |
| No log        | 19.96 | 240  | 0.1503          | 0.0625   | 0.9374     | 9.2149 | 0.0625   | 0.0074   | 0.0991 | 0.9404 |
| No log        | 20.96 | 252  | 0.1503          | 0.0625   | 0.9374     | 9.0900 | 0.0625   | 0.0074   | 0.1043 | 0.9386 |
| No log        | 21.96 | 264  | 0.1503          | 0.0625   | 0.9374     | 9.1244 | 0.0625   | 0.0074   | 0.1060 | 0.9395 |
| No log        | 22.96 | 276  | 0.1503          | 0.0625   | 0.9374     | 9.1353 | 0.0625   | 0.0074   | 0.1005 | 0.9378 |
| No log        | 23.96 | 288  | 0.1502          | 0.0625   | 0.9374     | 9.2063 | 0.0625   | 0.0074   | 0.1032 | 0.9373 |
| No log        | 24.96 | 300  | 0.1502          | 0.0625   | 0.9374     | 9.1398 | 0.0625   | 0.0074   | 0.1015 | 0.9383 |
### Framework versions

- Transformers 4.26.1
- PyTorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2