---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
model-index:
- name: xtremedistil-emotion
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: emotion
      type: emotion
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.9265
---

# xtremedistil-emotion

This model is a fine-tuned version of [microsoft/xtremedistil-l6-h256-uncased](https://huggingface.co/microsoft/xtremedistil-l6-h256-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Accuracy: 0.9265
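A minimal inference sketch. The repository id `bergum/xtremedistil-emotion` and the label order are assumptions inferred from this card and the emotion dataset card, not stated in the text above:

```python
# Hypothetical usage sketch; assumes `transformers` is installed and that the
# model is published as "bergum/xtremedistil-emotion" (an assumption).

# The emotion dataset's six labels, in index order (per the dataset card):
EMOTION_LABELS = ["sadness", "joy", "love", "anger", "fear", "surprise"]

def classify(text: str) -> str:
    """Return the predicted emotion label for a single input string."""
    from transformers import pipeline

    clf = pipeline("text-classification", model="bergum/xtremedistil-emotion")
    return clf(text)[0]["label"]

if __name__ == "__main__":
    print(classify("I am so happy today!"))
```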

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 128
- eval_batch_size: 8
- seed: 42
- num_epochs: 24
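The hyperparameters above can be sketched as a `transformers` Trainer setup. The mapping of the batch sizes onto `per_device_*` arguments, the tokenization settings (`max_length=128`, padding), and the output directory are illustrative assumptions; only the values listed above come from this card:

```python
# Hypothetical training sketch reproducing the listed hyperparameters.
# Tokenization settings and output_dir are assumptions, not from the card.

hparams = {
    "learning_rate": 3e-05,
    "per_device_train_batch_size": 128,  # card's train_batch_size
    "per_device_eval_batch_size": 8,     # card's eval_batch_size
    "seed": 42,
    "num_train_epochs": 24,
}

def build_trainer():
    """Assemble a Trainer for the emotion dataset (requires transformers + datasets)."""
    from datasets import load_dataset
    from transformers import (AutoModelForSequenceClassification,
                              AutoTokenizer, Trainer, TrainingArguments)

    tokenizer = AutoTokenizer.from_pretrained("microsoft/xtremedistil-l6-h256-uncased")
    dataset = load_dataset("emotion").map(
        lambda batch: tokenizer(batch["text"], truncation=True,
                                padding="max_length", max_length=128),
        batched=True,
    )
    model = AutoModelForSequenceClassification.from_pretrained(
        "microsoft/xtremedistil-l6-h256-uncased", num_labels=6
    )
    args = TrainingArguments(output_dir="xtremedistil-emotion", **hparams)
    return Trainer(model=model, args=args,
                   train_dataset=dataset["train"],
                   eval_dataset=dataset["validation"])

if __name__ == "__main__":
    build_trainer().train()
```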

### Training results

<pre>
Epoch   Training Loss   Validation Loss   Accuracy
1       No log          1.238589          0.609000
2       No log          0.934423          0.714000
3       No log          0.768701          0.742000
4       1.074800        0.638208          0.805500
5       1.074800        0.551363          0.851500
6       1.074800        0.476291          0.875500
7       1.074800        0.427313          0.883500
8       0.531500        0.392633          0.886000
9       0.531500        0.357979          0.892000
10      0.531500        0.330304          0.899500
11      0.531500        0.304529          0.907000
12      0.337200        0.287447          0.918000
13      0.337200        0.277067          0.921000
14      0.337200        0.259483          0.921000
15      0.337200        0.257564          0.916500
16      0.246200        0.241970          0.919500
17      0.246200        0.241537          0.921500
18      0.246200        0.235705          0.924500
19      0.246200        0.237325          0.920500
20      0.201400        0.229699          0.923500
21      0.201400        0.227426          0.923000
22      0.201400        0.228554          0.924000
23      0.201400        0.226941          0.925500
24      0.184300        0.225816          0.926500
</pre>
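The accuracy column is simply the fraction of correct validation predictions. A minimal helper (hypothetical, pure Python; the 2,000-example validation split size is an assumption based on the emotion dataset's standard splits, not stated above):

```python
def accuracy(predictions, references):
    """Fraction of predictions matching references (the metric reported above)."""
    if len(predictions) != len(references):
        raise ValueError("predictions and references must have the same length")
    correct = sum(p == r for p, r in zip(predictions, references))
    return correct / len(references)

# Assuming a 2,000-example validation split, 1853 correct predictions
# reproduce the 0.9265 accuracy reported at epoch 24: 1853 / 2000 = 0.9265
```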