erlend123 committed
Commit
c959076
1 Parent(s): 5ec002a

Model save

Files changed (2)
  1. README.md +17 -0
  2. model.safetensors +1 -1
README.md CHANGED
@@ -1,6 +1,9 @@
 ---
 tags:
 - generated_from_trainer
+metrics:
+- accuracy
+- f1
 model-index:
 - name: emotion-analysis-trans
   results: []
@@ -12,6 +15,10 @@ should probably proofread and complete it, then remove this comment. -->
 # emotion-analysis-trans
 
 This model was trained from scratch on an unknown dataset.
+It achieves the following results on the evaluation set:
+- Loss: 0.0950
+- Accuracy: 0.9429
+- F1: 0.9433
 
 ## Model description
 
@@ -38,6 +45,16 @@ The following hyperparameters were used during training:
 - lr_scheduler_type: linear
 - num_epochs: 5
 
+### Training results
+
+| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     |
+|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|
+| 0.0921        | 1.0   | 5211  | 0.0895          | 0.9418   | 0.9398 |
+| 0.0868        | 2.0   | 10422 | 0.0868          | 0.9420   | 0.9407 |
+| 0.0824        | 3.0   | 15633 | 0.0896          | 0.9425   | 0.9439 |
+| 0.0791        | 4.0   | 20844 | 0.0950          | 0.9429   | 0.9433 |
+
+
 ### Framework versions
 
 - Transformers 4.39.3
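The new accuracy and F1 entries in the card come from the Trainer's evaluation loop (the card is tagged `generated_from_trainer`). The metric code itself is not part of this commit; a minimal sketch of a `compute_metrics` callback that would produce results in this form, assuming a single-label classification head and scikit-learn for the scores, could look like this:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    # transformers.Trainer passes a (logits, labels) pair at evaluation time
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        # "weighted" averaging is an assumption; the card does not say which F1 variant was reported
        "f1": f1_score(labels, preds, average="weighted"),
    }
```

Passing such a function as `compute_metrics=` to the Trainer is what populates the per-epoch Accuracy and F1 columns in the table above.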
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:e7e143d461e57aeabb7b5998d308cbc9b1266d15641d91f38b5c10252f4c69e2
+oid sha256:12088eab4528fbd817ce68e3b5caea958bc600f82178b83ecf465f3c72aebb9f
 size 267844872
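The model.safetensors change is only the git-lfs pointer moving to a new blob (same byte size, new sha256), so downstream users pick up the retrained weights by re-downloading the repo. As a rough sketch, assuming the repo id is erlend123/emotion-analysis-trans and that the checkpoint carries a sequence-classification head (neither is stated in this commit):

```python
from transformers import AutoModelForSequenceClassification

# Hypothetical repo id inferred from the committer and model name; adjust if it differs.
model = AutoModelForSequenceClassification.from_pretrained("erlend123/emotion-analysis-trans")
# from_pretrained resolves the git-lfs pointer and downloads the current model.safetensors,
# i.e. the blob with the new sha256 shown above.
```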