---
license: mit
base_model: roberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: train
  results: []
---


# train

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) for seven-class emotion classification; the fine-tuning dataset is not documented.
It achieves the following results on the evaluation set:
- Loss: 0.6648
- Accuracy: 0.7617
- Balanced accuracy (B Acc): 0.6394
- Precision (Prec): 0.7595
- Recall: 0.7617
- F1: 0.7602
- Prec Joy: 0.7315
- Recall Joy: 0.7793
- F1 Joy: 0.7547
- Prec Anger: 0.6467
- Recall Anger: 0.6507
- F1 Anger: 0.6487
- Prec Disgust: 0.4710
- Recall Disgust: 0.4500
- F1 Disgust: 0.4603
- Prec Fear: 0.6963
- Recall Fear: 0.6409
- F1 Fear: 0.6675
- Prec Neutral: 0.8457
- Recall Neutral: 0.8490
- F1 Neutral: 0.8474
- Prec Sadness: 0.7094
- Recall Sadness: 0.6738
- F1 Sadness: 0.6911
- Prec Surprise: 0.5228
- Recall Surprise: 0.4323
- F1 Surprise: 0.4732
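
These scores can be recomputed from raw predictions with scikit-learn. Below is a minimal sketch; the helper name and label order are assumptions, and weighted averaging for the aggregate precision/recall/F1 is an assumption as well (though it is consistent with the aggregate recall matching accuracy):

```python
from sklearn.metrics import (
    accuracy_score,
    balanced_accuracy_score,
    precision_recall_fscore_support,
)

# Assumed label order; the checkpoint's id2label mapping is authoritative.
LABELS = ["joy", "anger", "disgust", "fear", "neutral", "sadness", "surprise"]

def emotion_metrics(y_true, y_pred):
    """Recompute the aggregate and per-class scores reported above."""
    metrics = {
        "accuracy": accuracy_score(y_true, y_pred),
        # "B Acc" is balanced accuracy: the macro-average of per-class recalls.
        "b_acc": balanced_accuracy_score(y_true, y_pred),
    }
    # Aggregate Prec/Recall/F1; weighted averaging assumed (see note above).
    prec, rec, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="weighted", zero_division=0
    )
    metrics.update(prec=prec, recall=rec, f1=f1)
    # Per-class rows ("Prec Joy", "Recall Anger", ...).
    p, r, f, _ = precision_recall_fscore_support(
        y_true, y_pred, labels=list(range(len(LABELS))), average=None,
        zero_division=0,
    )
    for i, name in enumerate(LABELS):
        metrics[f"prec_{name}"] = p[i]
        metrics[f"recall_{name}"] = r[i]
        metrics[f"f1_{name}"] = f[i]
    return metrics
```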

## Model description

A [roberta-base](https://huggingface.co/roberta-base) encoder with a sequence-classification head that predicts one of seven emotion labels: joy, anger, disgust, fear, neutral, sadness, or surprise (the classes covered by the per-class metrics above).

## Intended uses & limitations

Intended for single-label emotion classification of short English text. Per the evaluation results above, reliability varies considerably by class: neutral (F1 0.8474) and joy (F1 0.7547) are the strongest, while disgust (F1 0.4603) and surprise (F1 0.4732) are markedly weaker. Because the training data is undocumented, domain transfer and potential biases are uncharacterized.
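
As a usage sketch, the checkpoint can be served through the Transformers `text-classification` pipeline. The model id below is a placeholder for wherever this checkpoint is hosted:

```python
from transformers import pipeline

# Placeholder model id; point this at the actual repository or a local path.
classifier = pipeline("text-classification", model="path/to/this-checkpoint")

print(classifier("I can't believe this actually worked!"))
# Expected shape: [{'label': ..., 'score': ...}], with the label drawn from
# the seven emotion classes, assuming the checkpoint's id2label is configured.
```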

## Training and evaluation data

Not documented. The results in this card are reported on the run's evaluation set, covering the seven emotion classes listed above.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3.0
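
For reproducibility, these settings map onto `transformers.TrainingArguments` roughly as follows (a sketch: the batch sizes are assumed to be per device, and the output directory is a placeholder):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="train",               # placeholder; matches the run name
    learning_rate=5e-5,
    per_device_train_batch_size=128,  # assumed per-device, not global
    per_device_eval_batch_size=128,
    seed=42,
    adam_beta1=0.9,                   # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=3.0,
)
```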

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | B Acc  | Prec   | Recall | F1     | Prec Joy | Recall Joy | F1 Joy | Prec Anger | Recall Anger | F1 Anger | Prec Disgust | Recall Disgust | F1 Disgust | Prec Fear | Recall Fear | F1 Fear | Prec Neutral | Recall Neutral | F1 Neutral | Prec Sadness | Recall Sadness | F1 Sadness | Prec Surprise | Recall Surprise | F1 Surprise |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:------:|:------:|:--------:|:----------:|:------:|:----------:|:------------:|:--------:|:------------:|:--------------:|:----------:|:---------:|:-----------:|:-------:|:------------:|:--------------:|:----------:|:------------:|:--------------:|:----------:|:-------------:|:---------------:|:-----------:|
| 0.9538        | 0.15  | 232  | 0.8701          | 0.6961   | 0.4790 | 0.6837 | 0.6961 | 0.6837 | 0.7401   | 0.6381     | 0.6853 | 0.4622     | 0.5391       | 0.4977   | 0.25         | 0.0018         | 0.0035     | 0.5527    | 0.4292      | 0.4832  | 0.7965       | 0.8618         | 0.8279     | 0.5281       | 0.6431         | 0.5800     | 0.3562        | 0.2398          | 0.2866      |
| 0.7952        | 0.3   | 464  | 0.8010          | 0.7168   | 0.5242 | 0.7098 | 0.7168 | 0.7025 | 0.8084   | 0.5948     | 0.6853 | 0.5732     | 0.4710       | 0.5171   | 0.4713       | 0.2643         | 0.3387     | 0.6156    | 0.5263      | 0.5675  | 0.7405       | 0.9250         | 0.8226     | 0.6858       | 0.5676         | 0.6211     | 0.4448        | 0.3204          | 0.3725      |
| 0.7528        | 0.45  | 696  | 0.7560          | 0.7261   | 0.5878 | 0.7309 | 0.7261 | 0.7256 | 0.6969   | 0.7646     | 0.7292 | 0.5550     | 0.5534       | 0.5542   | 0.3409       | 0.4821         | 0.3994     | 0.7225    | 0.4842      | 0.5798  | 0.8476       | 0.8159         | 0.8314     | 0.6118       | 0.7027         | 0.6541     | 0.4957        | 0.3118          | 0.3828      |
| 0.7334        | 0.6   | 928  | 0.7310          | 0.7370   | 0.5868 | 0.7345 | 0.7370 | 0.7283 | 0.7170   | 0.7458     | 0.7311 | 0.7129     | 0.4116       | 0.5219   | 0.3727       | 0.5696         | 0.4506     | 0.6671    | 0.5626      | 0.6104  | 0.7898       | 0.8859         | 0.8351     | 0.7318       | 0.5844         | 0.6499     | 0.5252        | 0.3473          | 0.4181      |
| 0.7216        | 0.75  | 1160 | 0.7043          | 0.7448   | 0.6009 | 0.7403 | 0.7448 | 0.7389 | 0.7767   | 0.6826     | 0.7266 | 0.6159     | 0.5386       | 0.5746   | 0.5302       | 0.4393         | 0.4805     | 0.8023    | 0.5602      | 0.6598  | 0.7854       | 0.8926         | 0.8356     | 0.7005       | 0.632          | 0.6645     | 0.4815        | 0.4613          | 0.4712      |
| 0.7259        | 0.9   | 1392 | 0.6962          | 0.7475   | 0.6082 | 0.7433 | 0.7475 | 0.7412 | 0.7355   | 0.7586     | 0.7469 | 0.6758     | 0.4504       | 0.5405   | 0.3908       | 0.5589         | 0.4600     | 0.6939    | 0.6070      | 0.6475  | 0.8122       | 0.8744         | 0.8421     | 0.6830       | 0.6676         | 0.6752     | 0.5494        | 0.3409          | 0.4207      |
| 0.6362        | 1.05  | 1624 | 0.6771          | 0.7526   | 0.6055 | 0.7472 | 0.7526 | 0.7484 | 0.7392   | 0.7483     | 0.7437 | 0.5873     | 0.6191       | 0.6028   | 0.5302       | 0.3768         | 0.4405     | 0.7388    | 0.5789      | 0.6492  | 0.8213       | 0.8670         | 0.8435     | 0.7090       | 0.6507         | 0.6786     | 0.5301        | 0.3978          | 0.4545      |
| 0.621         | 1.2   | 1856 | 0.6779          | 0.7528   | 0.6120 | 0.7494 | 0.7528 | 0.7487 | 0.7107   | 0.7828     | 0.7450 | 0.6508     | 0.5913       | 0.6196   | 0.4980       | 0.4518         | 0.4738     | 0.7963    | 0.5532      | 0.6529  | 0.8165       | 0.8590         | 0.8372     | 0.7499       | 0.6236         | 0.6809     | 0.5078        | 0.4226          | 0.4613      |
| 0.6241        | 1.35  | 2088 | 0.6849          | 0.7513   | 0.6367 | 0.7526 | 0.7513 | 0.7514 | 0.7429   | 0.7592     | 0.7510 | 0.5795     | 0.6531       | 0.6141   | 0.4372       | 0.4661         | 0.4512     | 0.6462    | 0.6515      | 0.6488  | 0.8492       | 0.8372         | 0.8432     | 0.6887       | 0.6609         | 0.6745     | 0.5271        | 0.4290          | 0.4730      |
| 0.6188        | 1.5   | 2320 | 0.6713          | 0.7579   | 0.6159 | 0.7539 | 0.7579 | 0.7534 | 0.7071   | 0.7971     | 0.7494 | 0.6343     | 0.6267       | 0.6305   | 0.5877       | 0.3768         | 0.4592     | 0.7247    | 0.6281      | 0.6729  | 0.8361       | 0.8496         | 0.8428     | 0.6943       | 0.6693         | 0.6816     | 0.5919        | 0.3634          | 0.4504      |
| 0.6182        | 1.65  | 2552 | 0.6608          | 0.7601   | 0.6199 | 0.7567 | 0.7601 | 0.7566 | 0.7143   | 0.7891     | 0.7498 | 0.6163     | 0.6358       | 0.6259   | 0.5607       | 0.3875         | 0.4583     | 0.7591    | 0.6082      | 0.6753  | 0.8375       | 0.8578         | 0.8475     | 0.7324       | 0.6436         | 0.6851     | 0.5381        | 0.4172          | 0.4700      |
| 0.6392        | 1.8   | 2784 | 0.6542          | 0.7624   | 0.6261 | 0.7593 | 0.7624 | 0.7596 | 0.7513   | 0.7584     | 0.7548 | 0.5970     | 0.6708       | 0.6318   | 0.5711       | 0.3875         | 0.4617     | 0.7482    | 0.6152      | 0.6752  | 0.8379       | 0.8635         | 0.8505     | 0.7076       | 0.668          | 0.6872     | 0.5132        | 0.4194          | 0.4615      |
| 0.6158        | 1.95  | 3016 | 0.6456          | 0.7649   | 0.6279 | 0.7599 | 0.7649 | 0.7614 | 0.7490   | 0.7548     | 0.7519 | 0.6402     | 0.6378       | 0.6390   | 0.5314       | 0.4232         | 0.4712     | 0.7569    | 0.6117      | 0.6766  | 0.8310       | 0.8753         | 0.8526     | 0.7199       | 0.6627         | 0.6901     | 0.5063        | 0.4301          | 0.4651      |
| 0.554         | 2.1   | 3248 | 0.6742          | 0.7584   | 0.6346 | 0.7555 | 0.7584 | 0.7564 | 0.7293   | 0.7732     | 0.7506 | 0.6433     | 0.6430       | 0.6432   | 0.5031       | 0.4393         | 0.4690     | 0.7292    | 0.6363      | 0.6796  | 0.8347       | 0.8496         | 0.8421     | 0.7163       | 0.6587         | 0.6863     | 0.5049        | 0.4419          | 0.4713      |
| 0.5537        | 2.25  | 3480 | 0.6708          | 0.7633   | 0.6283 | 0.7604 | 0.7633 | 0.7605 | 0.7263   | 0.7801     | 0.7523 | 0.6304     | 0.6612       | 0.6455   | 0.5806       | 0.3732         | 0.4543     | 0.7486    | 0.6094      | 0.6718  | 0.8442       | 0.8528         | 0.8485     | 0.6982       | 0.692          | 0.6951     | 0.5356        | 0.4290          | 0.4764      |
| 0.5375        | 2.4   | 3712 | 0.6712          | 0.7606   | 0.6402 | 0.7592 | 0.7606 | 0.7595 | 0.7373   | 0.7709     | 0.7537 | 0.6245     | 0.6608       | 0.6421   | 0.4827       | 0.4482         | 0.4648     | 0.7319    | 0.6257      | 0.6747  | 0.8454       | 0.8474         | 0.8464     | 0.7006       | 0.6769         | 0.6885     | 0.5204        | 0.4516          | 0.4836      |
| 0.5175        | 2.55  | 3944 | 0.6625          | 0.7625   | 0.6369 | 0.7600 | 0.7625 | 0.7604 | 0.7422   | 0.7642     | 0.7530 | 0.6335     | 0.6526       | 0.6429   | 0.4481       | 0.4929         | 0.4694     | 0.7482    | 0.6187      | 0.6773  | 0.8374       | 0.8604         | 0.8488     | 0.7252       | 0.6684         | 0.6957     | 0.5321        | 0.4011          | 0.4574      |
| 0.5182        | 2.7   | 4176 | 0.6621          | 0.7631   | 0.6404 | 0.7602 | 0.7631 | 0.7612 | 0.7343   | 0.7766     | 0.7549 | 0.6491     | 0.6392       | 0.6441   | 0.4739       | 0.4536         | 0.4635     | 0.6784    | 0.6538      | 0.6659  | 0.8444       | 0.8529         | 0.8486     | 0.7109       | 0.684          | 0.6972     | 0.5458        | 0.4226          | 0.4764      |
| 0.5148        | 2.85  | 4408 | 0.6638          | 0.7637   | 0.6383 | 0.7598 | 0.7637 | 0.7612 | 0.7394   | 0.7741     | 0.7563 | 0.6741     | 0.6205       | 0.6462   | 0.5          | 0.4375         | 0.4667     | 0.6813    | 0.6550      | 0.6679  | 0.8400       | 0.8572         | 0.8485     | 0.6922       | 0.6916         | 0.6919     | 0.5296        | 0.4323          | 0.4760      |


### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.2
- Tokenizers 0.13.3