---
license: mit
base_model: roberta-large
tags:
- generated_from_trainer
metrics:
- accuracy
- recall
- f1
model-index:
- name: lora-roberta-large-no-ed
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# lora-roberta-large-no-ed

This model is a fine-tuned version of [roberta-large](https://huggingface.co/roberta-large) on an unknown dataset, for 7-class emotion classification (joy, anger, disgust, fear, neutral, sadness, surprise).
It achieves the following results on the evaluation set:
- Loss: 0.6945
- Accuracy: 0.7549
- Precision (macro): 0.6320
- Recall (macro): 0.6106
- F1 (macro): 0.6199
- Balanced accuracy (B Acc): 0.6106
- Micro F1: 0.7549
- Precision (joy): 0.7138
- Recall (joy): 0.7631
- F1 (joy): 0.7376
- Precision (anger): 0.6130
- Recall (anger): 0.6316
- F1 (anger): 0.6222
- Precision (disgust): 0.4396
- Recall (disgust): 0.4038
- F1 (disgust): 0.4209
- Precision (fear): 0.6623
- Recall (fear): 0.5886
- F1 (fear): 0.6233
- Precision (neutral): 0.8481
- Recall (neutral): 0.8428
- F1 (neutral): 0.8455
- Precision (sadness): 0.6626
- Recall (sadness): 0.6619
- F1 (sadness): 0.6622
- Precision (surprise): 0.4846
- Recall (surprise): 0.3824
- F1 (surprise): 0.4274
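
The abbreviated names ("Prec", "B Acc") match the column headers in the training-results table below. For loading the checkpoint, the sketch below assumes the adapter was saved in PEFT's LoRA format on top of a `roberta-large` sequence-classification head; the adapter path and the label order are illustrative assumptions, not recorded in this card.

```python
# Minimal inference sketch. Assumptions (not recorded in this card): the
# checkpoint is a PEFT LoRA adapter, and the label order is illustrative only.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from peft import PeftModel

LABELS = ["joy", "anger", "disgust", "fear", "neutral", "sadness", "surprise"]  # assumed order

base = AutoModelForSequenceClassification.from_pretrained(
    "roberta-large", num_labels=len(LABELS)
)
model = PeftModel.from_pretrained(base, "path/to/lora-roberta-large-no-ed")  # adapter path (placeholder)
tokenizer = AutoTokenizer.from_pretrained("roberta-large")

model.eval()
inputs = tokenizer("I can't believe this actually worked!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(LABELS[logits.argmax(dim=-1).item()])
```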

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure
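
This checkpoint was trained with LoRA (per the model name), but the adapter configuration (rank, alpha, dropout, target modules) is not recorded in this card. The sketch below shows a typical PEFT setup for a RoBERTa classification head; every LoRA value in it is a placeholder, not the setting actually used.

```python
# Illustrative PEFT LoRA setup only: r, lora_alpha, lora_dropout, and
# target_modules are placeholders, NOT the values used for this checkpoint.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForSequenceClassification

base = AutoModelForSequenceClassification.from_pretrained("roberta-large", num_labels=7)
lora_config = LoraConfig(
    task_type="SEQ_CLS",
    r=8,                                # placeholder rank
    lora_alpha=16,                      # placeholder scaling factor
    lora_dropout=0.1,                   # placeholder
    target_modules=["query", "value"],  # common choice for RoBERTa self-attention
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```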

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 20.0
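
For reference, these settings map onto `TrainingArguments` roughly as follows. This is a reconstruction, not the original training script; `output_dir` is a placeholder, and the Adam betas and epsilon listed above are the library defaults.

```python
# Reconstruction of the hyperparameters above as TrainingArguments
# (Transformers 4.32-era API); not the original training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="lora-roberta-large-no-ed",  # placeholder
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,          # 32 * 4 = 128 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=20.0,
    adam_beta1=0.9,                         # Adam defaults, as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```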

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Prec   | Recall | F1     | B Acc  | Micro F1 | Prec Joy | Recall Joy | F1 Joy | Prec Anger | Recall Anger | F1 Anger | Prec Disgust | Recall Disgust | F1 Disgust | Prec Fear | Recall Fear | F1 Fear | Prec Neutral | Recall Neutral | F1 Neutral | Prec Sadness | Recall Sadness | F1 Sadness | Prec Surprise | Recall Surprise | F1 Surprise |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:------:|:------:|:------:|:--------:|:--------:|:----------:|:------:|:----------:|:------------:|:--------:|:------------:|:--------------:|:----------:|:---------:|:-----------:|:-------:|:------------:|:--------------:|:----------:|:------------:|:--------------:|:----------:|:-------------:|:---------------:|:-----------:|
| 0.7938        | 1.0   | 1465  | 0.7589          | 0.7257   | 0.6233 | 0.4993 | 0.5433 | 0.4993 | 0.7257   | 0.7259   | 0.6828     | 0.7037 | 0.6223     | 0.4082       | 0.4930   | 0.5359       | 0.2657         | 0.3552     | 0.5925    | 0.5110      | 0.5487  | 0.7564       | 0.9097         | 0.8260     | 0.7150       | 0.4865         | 0.5790     | 0.4151        | 0.2315          | 0.2972      |
| 0.7546        | 2.0   | 2930  | 0.7482          | 0.7243   | 0.6272 | 0.5499 | 0.5735 | 0.5499 | 0.7243   | 0.6028   | 0.8315     | 0.6989 | 0.5325     | 0.5802       | 0.5553   | 0.5135       | 0.2782         | 0.3609     | 0.6619    | 0.5388      | 0.5940  | 0.8498       | 0.8045         | 0.8265     | 0.74         | 0.5294         | 0.6172     | 0.4902        | 0.2864          | 0.3616      |
| 0.7289        | 3.0   | 4395  | 0.7293          | 0.7321   | 0.6234 | 0.5839 | 0.5984 | 0.5839 | 0.7321   | 0.6491   | 0.7901     | 0.7127 | 0.6129     | 0.5271       | 0.5668   | 0.4413       | 0.4561         | 0.4486     | 0.6974    | 0.5198      | 0.5956  | 0.8364       | 0.8146         | 0.8254     | 0.6406       | 0.6423         | 0.6414     | 0.4862        | 0.3376          | 0.3985      |
| 0.7076        | 4.0   | 5860  | 0.6898          | 0.7466   | 0.6572 | 0.5649 | 0.5972 | 0.5649 | 0.7466   | 0.7573   | 0.6911     | 0.7227 | 0.5110     | 0.6565       | 0.5747   | 0.4868       | 0.3096         | 0.3785     | 0.8139    | 0.4802      | 0.6041  | 0.8125       | 0.8772         | 0.8436     | 0.6939       | 0.5946         | 0.6404     | 0.5253        | 0.3453          | 0.4167      |
| 0.6925        | 5.0   | 7325  | 0.7039          | 0.7403   | 0.6121 | 0.5916 | 0.5972 | 0.5916 | 0.7403   | 0.6933   | 0.7525     | 0.7217 | 0.5234     | 0.6372       | 0.5747   | 0.3630       | 0.4100         | 0.3851     | 0.6121    | 0.5798      | 0.5955  | 0.8446       | 0.8363         | 0.8404     | 0.7512       | 0.5713         | 0.6490     | 0.4973        | 0.3542          | 0.4137      |
| 0.6841        | 6.0   | 8790  | 0.6704          | 0.7516   | 0.6607 | 0.5820 | 0.6076 | 0.5820 | 0.7516   | 0.7158   | 0.7536     | 0.7342 | 0.6577     | 0.4856       | 0.5587   | 0.4195       | 0.5502         | 0.4760     | 0.8476    | 0.4641      | 0.5998  | 0.8120       | 0.8784         | 0.8439     | 0.6971       | 0.6184         | 0.6554     | 0.4756        | 0.3235          | 0.3851      |
| 0.6715        | 7.0   | 10255 | 0.6919          | 0.7412   | 0.6246 | 0.6180 | 0.6112 | 0.6180 | 0.7412   | 0.7020   | 0.7642     | 0.7318 | 0.5513     | 0.6034       | 0.5762   | 0.3682       | 0.5962         | 0.4553     | 0.8024    | 0.4817      | 0.6020  | 0.8602       | 0.8191         | 0.8391     | 0.6611       | 0.6481         | 0.6545     | 0.4267        | 0.4130          | 0.4198      |
| 0.6562        | 8.0   | 11720 | 0.7245          | 0.7325   | 0.5985 | 0.6129 | 0.6014 | 0.6129 | 0.7325   | 0.6499   | 0.8167     | 0.7238 | 0.5320     | 0.6211       | 0.5731   | 0.3779       | 0.4728         | 0.4201     | 0.5704    | 0.6047      | 0.5871  | 0.8771       | 0.7863         | 0.8292     | 0.7431       | 0.6010         | 0.6645     | 0.4391        | 0.3875          | 0.4117      |
| 0.6426        | 9.0   | 13185 | 0.6683          | 0.7510   | 0.6304 | 0.6109 | 0.6175 | 0.6109 | 0.7510   | 0.7216   | 0.7506     | 0.7358 | 0.5768     | 0.6001       | 0.5882   | 0.3908       | 0.4603         | 0.4227     | 0.7469    | 0.5359      | 0.6240  | 0.8458       | 0.8458         | 0.8458     | 0.6966       | 0.6412         | 0.6678     | 0.4347        | 0.4425          | 0.4385      |
| 0.6278        | 10.0  | 14650 | 0.6661          | 0.7545   | 0.6427 | 0.5968 | 0.6142 | 0.5968 | 0.7545   | 0.7531   | 0.712      | 0.7320 | 0.6346     | 0.5476       | 0.5879   | 0.4574       | 0.4268         | 0.4416     | 0.7220    | 0.5476      | 0.6228  | 0.8304       | 0.8692         | 0.8494     | 0.5931       | 0.7276         | 0.6535     | 0.5084        | 0.3465          | 0.4122      |
| 0.6218        | 11.0  | 16115 | 0.6714          | 0.7507   | 0.6478 | 0.5958 | 0.6143 | 0.5958 | 0.7507   | 0.6878   | 0.7864     | 0.7338 | 0.6796     | 0.4950       | 0.5728   | 0.4181       | 0.4916         | 0.4519     | 0.7635    | 0.4963      | 0.6016  | 0.8324       | 0.8512         | 0.8417     | 0.6816       | 0.6524         | 0.6667     | 0.4719        | 0.3977          | 0.4316      |
| 0.6077        | 12.0  | 17580 | 0.6649          | 0.7543   | 0.6216 | 0.6171 | 0.6187 | 0.6171 | 0.7543   | 0.7496   | 0.7249     | 0.7371 | 0.6055     | 0.6095       | 0.6075   | 0.4449       | 0.4142         | 0.4290     | 0.6194    | 0.6076      | 0.6135  | 0.8426       | 0.8568         | 0.8497     | 0.6894       | 0.6386         | 0.6630     | 0.4           | 0.4680          | 0.4313      |
| 0.5868        | 13.0  | 19045 | 0.6680          | 0.7584   | 0.6473 | 0.6026 | 0.6224 | 0.6026 | 0.7584   | 0.7192   | 0.7522     | 0.7354 | 0.6442     | 0.5658       | 0.6025   | 0.4398       | 0.4435         | 0.4417     | 0.7127    | 0.5666      | 0.6313  | 0.8293       | 0.8711         | 0.8497     | 0.7187       | 0.6174         | 0.6642     | 0.4673        | 0.4015          | 0.4319      |
| 0.5747        | 14.0  | 20510 | 0.6692          | 0.7551   | 0.6293 | 0.6049 | 0.6155 | 0.6049 | 0.7551   | 0.7114   | 0.7621     | 0.7359 | 0.5985     | 0.6167       | 0.6075   | 0.4461       | 0.3808         | 0.4108     | 0.6088    | 0.6061      | 0.6075  | 0.8444       | 0.8522         | 0.8483     | 0.7124       | 0.6222         | 0.6642     | 0.4835        | 0.3939          | 0.4341      |
| 0.5632        | 15.0  | 21975 | 0.6763          | 0.7551   | 0.6390 | 0.6104 | 0.6185 | 0.6104 | 0.7551   | 0.6978   | 0.7812     | 0.7371 | 0.6381     | 0.5774       | 0.6063   | 0.4179       | 0.5272         | 0.4662     | 0.6260    | 0.5710      | 0.5972  | 0.8432       | 0.8479         | 0.8455     | 0.6950       | 0.6460         | 0.6696     | 0.5551        | 0.3223          | 0.4078      |
| 0.546         | 16.0  | 23440 | 0.6880          | 0.7537   | 0.6365 | 0.6089 | 0.6205 | 0.6089 | 0.7537   | 0.6906   | 0.7878     | 0.7360 | 0.6121     | 0.625        | 0.6185   | 0.4564       | 0.3828         | 0.4164     | 0.6587    | 0.6076      | 0.6321  | 0.8493       | 0.8350         | 0.8421     | 0.6999       | 0.6428         | 0.6702     | 0.4885        | 0.3811          | 0.4282      |
| 0.5354        | 17.0  | 24905 | 0.6823          | 0.7545   | 0.6399 | 0.6097 | 0.6222 | 0.6097 | 0.7545   | 0.6972   | 0.7828     | 0.7375 | 0.6131     | 0.6355       | 0.6241   | 0.4916       | 0.3682         | 0.4211     | 0.6979    | 0.5783      | 0.6325  | 0.8525       | 0.8357         | 0.8440     | 0.6820       | 0.6455         | 0.6632     | 0.4447        | 0.4220          | 0.4331      |
| 0.5103        | 18.0  | 26370 | 0.6852          | 0.7581   | 0.6440 | 0.6039 | 0.6206 | 0.6039 | 0.7581   | 0.7167   | 0.7642     | 0.7397 | 0.6203     | 0.6289       | 0.6246   | 0.4767       | 0.3849         | 0.4259     | 0.6752    | 0.5813      | 0.6247  | 0.8418       | 0.8525         | 0.8471     | 0.6674       | 0.6614         | 0.6644     | 0.5101        | 0.3542          | 0.4181      |
| 0.4972        | 19.0  | 27835 | 0.6948          | 0.7535   | 0.6350 | 0.6039 | 0.6162 | 0.6039 | 0.7535   | 0.7038   | 0.7715     | 0.7361 | 0.5989     | 0.6515       | 0.6241   | 0.4658       | 0.3703         | 0.4126     | 0.6739    | 0.5871      | 0.6275  | 0.8495       | 0.8381         | 0.8438     | 0.6631       | 0.6571         | 0.6601     | 0.4902        | 0.3517          | 0.4095      |
| 0.4801        | 20.0  | 29300 | 0.6945          | 0.7549   | 0.6320 | 0.6106 | 0.6199 | 0.6106 | 0.7549   | 0.7138   | 0.7631     | 0.7376 | 0.6130     | 0.6316       | 0.6222   | 0.4396       | 0.4038         | 0.4209     | 0.6623    | 0.5886      | 0.6233  | 0.8481       | 0.8428         | 0.8455     | 0.6626       | 0.6619         | 0.6622     | 0.4846        | 0.3824          | 0.4274      |


### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.11.0