---
license: mit
base_model: roberta-large
tags:
- generated_from_trainer
metrics:
- accuracy
- recall
- f1
model-index:
- name: lora-roberta-large-no-ed
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# lora-roberta-large-no-ed

This model is a fine-tuned version of [roberta-large](https://huggingface.co/roberta-large) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6577
- Accuracy: 0.7631
- Precision (macro): 0.6548
- Recall (macro): 0.6054
- F1 (macro): 0.6277
- Balanced accuracy: 0.6054
- Micro F1: 0.7631
- Precision (joy): 0.7442
- Recall (joy): 0.7318
- F1 (joy): 0.7379
- Precision (anger): 0.6340
- Recall (anger): 0.6007
- F1 (anger): 0.6169
- Precision (disgust): 0.4641
- Recall (disgust): 0.4059
- F1 (disgust): 0.4330
- Precision (fear): 0.6923
- Recall (fear): 0.5930
- F1 (fear): 0.6388
- Precision (neutral): 0.8246
- Recall (neutral): 0.8811
- F1 (neutral): 0.8519
- Precision (sadness): 0.7164
- Recall (sadness): 0.6264
- F1 (sadness): 0.6684
- Precision (surprise): 0.5081
- Recall (surprise): 0.3990
- F1 (surprise): 0.4470
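
The per-class rows above imply a seven-way emotion classification head (joy, anger, disgust, fear, neutral, sadness, surprise). Because this checkpoint is a LoRA adapter over `roberta-large`, inference would look roughly like the sketch below; the adapter repo id and the label order are assumptions, not confirmed by this card.

```python
# Minimal inference sketch (not part of the generated card). The adapter repo
# id and the label order are assumptions; substitute the actual values.
import torch
from peft import PeftModel
from transformers import AutoModelForSequenceClassification, AutoTokenizer

base = AutoModelForSequenceClassification.from_pretrained("roberta-large", num_labels=7)
model = PeftModel.from_pretrained(base, "lora-roberta-large-no-ed")  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained("roberta-large")
model.eval()

# Assumed label order, inferred from the per-class metrics listed above.
labels = ["joy", "anger", "disgust", "fear", "neutral", "sadness", "surprise"]

inputs = tokenizer("I can't believe this actually worked!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(labels[logits.argmax(dim=-1).item()])
```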

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 15.0
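
These settings map directly onto `transformers.TrainingArguments`; a rough reconstruction is sketched below (the original training script is not included in this card, and `output_dir` is a placeholder). The relatively high learning rate of 1e-3 is typical for LoRA fine-tuning, where only the adapter weights are updated.

```python
# Rough reconstruction of the reported hyperparameters as TrainingArguments;
# the original training script is not part of this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="lora-roberta-large-no-ed",  # placeholder
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 32 * 4 = 128
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=15.0,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default.
)
```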

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Precision (macro) | Recall (macro) | F1 (macro) | Balanced Acc | Micro F1 | Precision (Joy) | Recall (Joy) | F1 (Joy) | Precision (Anger) | Recall (Anger) | F1 (Anger) | Precision (Disgust) | Recall (Disgust) | F1 (Disgust) | Precision (Fear) | Recall (Fear) | F1 (Fear) | Precision (Neutral) | Recall (Neutral) | F1 (Neutral) | Precision (Sadness) | Recall (Sadness) | F1 (Sadness) | Precision (Surprise) | Recall (Surprise) | F1 (Surprise) |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:------:|:------:|:------:|:--------:|:--------:|:----------:|:------:|:----------:|:------------:|:--------:|:------------:|:--------------:|:----------:|:---------:|:-----------:|:-------:|:------------:|:--------------:|:----------:|:------------:|:--------------:|:----------:|:-------------:|:---------------:|:-----------:|
| 0.8081        | 0.75  | 1099  | 0.7901          | 0.7138   | 0.5617 | 0.5642 | 0.5601 | 0.5642 | 0.7138   | 0.7312   | 0.6492     | 0.6878 | 0.4974     | 0.5354       | 0.5157   | 0.4330       | 0.3515         | 0.3880     | 0.5277    | 0.5447      | 0.5360  | 0.8403       | 0.8288         | 0.8345     | 0.5152       | 0.6820         | 0.5870     | 0.3867        | 0.3581          | 0.3718      |
| 0.7543        | 1.5   | 2198  | 0.7482          | 0.7263   | 0.5892 | 0.5714 | 0.5737 | 0.5714 | 0.7263   | 0.6611   | 0.7786     | 0.7151 | 0.578      | 0.4795       | 0.5242   | 0.4229       | 0.4644         | 0.4427     | 0.5082    | 0.5900      | 0.5461  | 0.8353       | 0.8242         | 0.8297     | 0.6356       | 0.5999         | 0.6172     | 0.4836        | 0.2634          | 0.3411      |
| 0.7292        | 2.25  | 3297  | 0.7176          | 0.7392   | 0.6337 | 0.5729 | 0.5834 | 0.5729 | 0.7392   | 0.7077   | 0.7367     | 0.7219 | 0.6069     | 0.4928       | 0.5440   | 0.3188       | 0.5816         | 0.4119     | 0.6310    | 0.5608      | 0.5938  | 0.8031       | 0.8778         | 0.8388     | 0.8176       | 0.5034         | 0.6232     | 0.5507        | 0.2570          | 0.3505      |
| 0.7138        | 3.0   | 4396  | 0.6883          | 0.7448   | 0.6145 | 0.5918 | 0.6005 | 0.5918 | 0.7448   | 0.7004   | 0.7614     | 0.7297 | 0.5848     | 0.5819       | 0.5833   | 0.4116       | 0.4142         | 0.4129     | 0.5827    | 0.5827      | 0.5827  | 0.8380       | 0.8428         | 0.8404     | 0.6838       | 0.6222         | 0.6515     | 0.5           | 0.3376          | 0.4031      |
| 0.7046        | 3.75  | 5495  | 0.6826          | 0.7465   | 0.6275 | 0.5789 | 0.5986 | 0.5789 | 0.7465   | 0.7145   | 0.748      | 0.7309 | 0.5822     | 0.5658       | 0.5739   | 0.5220       | 0.3222         | 0.3984     | 0.6403    | 0.5212      | 0.5747  | 0.8318       | 0.8535         | 0.8425     | 0.6559       | 0.6476         | 0.6517     | 0.4457        | 0.3939          | 0.4182      |
| 0.6767        | 4.5   | 6594  | 0.6971          | 0.7436   | 0.6423 | 0.5649 | 0.5923 | 0.5649 | 0.7436   | 0.7414   | 0.7028     | 0.7216 | 0.6387     | 0.5055       | 0.5644   | 0.5714       | 0.2678         | 0.3647     | 0.6597    | 0.5564      | 0.6037  | 0.8056       | 0.8720         | 0.8375     | 0.5985       | 0.6826         | 0.6378     | 0.4807        | 0.3670          | 0.4162      |
| 0.661         | 5.25  | 7693  | 0.7124          | 0.7384   | 0.6295 | 0.6028 | 0.6031 | 0.6028 | 0.7384   | 0.6697   | 0.7991     | 0.7287 | 0.4849     | 0.7124       | 0.5771   | 0.3955       | 0.4435         | 0.4181     | 0.7064    | 0.5461      | 0.6160  | 0.8814       | 0.7958         | 0.8364     | 0.6848       | 0.6322         | 0.6575     | 0.5835        | 0.2903          | 0.3877      |
| 0.6652        | 6.0   | 8792  | 0.6706          | 0.7529   | 0.6441 | 0.5942 | 0.6136 | 0.5942 | 0.7529   | 0.7386   | 0.7306     | 0.7346 | 0.7153     | 0.4309       | 0.5378   | 0.4612       | 0.4351         | 0.4478     | 0.6354    | 0.5944      | 0.6142  | 0.8081       | 0.8841         | 0.8444     | 0.6859       | 0.6354         | 0.6597     | 0.4643        | 0.4488          | 0.4564      |
| 0.6532        | 6.75  | 9891  | 0.6567          | 0.7582   | 0.6578 | 0.5853 | 0.6146 | 0.5853 | 0.7582   | 0.7473   | 0.7264     | 0.7367 | 0.6156     | 0.5642       | 0.5887   | 0.5014       | 0.3682         | 0.4246     | 0.7068    | 0.5505      | 0.6189  | 0.8176       | 0.8815         | 0.8484     | 0.6602       | 0.6672         | 0.6637     | 0.5556        | 0.3389          | 0.4210      |
| 0.6314        | 7.5   | 10990 | 0.6726          | 0.7555   | 0.6673 | 0.5864 | 0.6142 | 0.5864 | 0.7555   | 0.7029   | 0.7795     | 0.7393 | 0.5800     | 0.6433       | 0.6100   | 0.5350       | 0.3201         | 0.4005     | 0.8117    | 0.4861      | 0.6081  | 0.8422       | 0.8456         | 0.8439     | 0.6651       | 0.6725         | 0.6688     | 0.5344        | 0.3581          | 0.4288      |
| 0.6045        | 8.25  | 12089 | 0.6668          | 0.7578   | 0.6551 | 0.6006 | 0.6238 | 0.6006 | 0.7578   | 0.7288   | 0.7468     | 0.7377 | 0.6554     | 0.5597       | 0.6038   | 0.4684       | 0.4038         | 0.4337     | 0.7683    | 0.5388      | 0.6334  | 0.8249       | 0.8693         | 0.8466     | 0.6924       | 0.6418         | 0.6661     | 0.4472        | 0.4437          | 0.4454      |
| 0.6182        | 9.0   | 13188 | 0.6659          | 0.7571   | 0.6461 | 0.6044 | 0.6205 | 0.6044 | 0.7571   | 0.7164   | 0.7602     | 0.7377 | 0.6389     | 0.5813       | 0.6087   | 0.4511       | 0.4435         | 0.4473     | 0.6770    | 0.5739      | 0.6212  | 0.8373       | 0.8555         | 0.8463     | 0.6523       | 0.6842         | 0.6679     | 0.5497        | 0.3325          | 0.4143      |
| 0.5927        | 9.75  | 14287 | 0.7097          | 0.7466   | 0.6561 | 0.5640 | 0.5952 | 0.5640 | 0.7466   | 0.7228   | 0.7136     | 0.7182 | 0.6138     | 0.5785       | 0.5957   | 0.5833       | 0.2490         | 0.3490     | 0.7201    | 0.5652      | 0.6333  | 0.8081       | 0.8686         | 0.8372     | 0.6367       | 0.6688         | 0.6524     | 0.5075        | 0.3043          | 0.3805      |
| 0.5736        | 10.5  | 15386 | 0.6663          | 0.7587   | 0.6494 | 0.6092 | 0.6225 | 0.6092 | 0.7587   | 0.7282   | 0.7576     | 0.7426 | 0.5869     | 0.6554       | 0.6193   | 0.5          | 0.3745         | 0.4282     | 0.6807    | 0.5930      | 0.6338  | 0.8502       | 0.8443         | 0.8473     | 0.6361       | 0.7122         | 0.672      | 0.5639        | 0.3274          | 0.4142      |
| 0.5687        | 11.25 | 16485 | 0.6599          | 0.7633   | 0.6595 | 0.6148 | 0.6337 | 0.6148 | 0.7633   | 0.7366   | 0.7447     | 0.7406 | 0.6489     | 0.6062       | 0.6268   | 0.4898       | 0.4519         | 0.4701     | 0.7461    | 0.5637      | 0.6422  | 0.8389       | 0.8663         | 0.8524     | 0.6401       | 0.6804         | 0.6596     | 0.5161        | 0.3900          | 0.4443      |
| 0.5652        | 12.0  | 17584 | 0.6577          | 0.7631   | 0.6548 | 0.6054 | 0.6277 | 0.6054 | 0.7631   | 0.7442   | 0.7318     | 0.7379 | 0.6340     | 0.6007       | 0.6169   | 0.4641       | 0.4059         | 0.4330     | 0.6923    | 0.5930      | 0.6388  | 0.8246       | 0.8811         | 0.8519     | 0.7164       | 0.6264         | 0.6684     | 0.5081        | 0.3990          | 0.4470      |
| 0.5377        | 12.75 | 18683 | 0.6681          | 0.7620   | 0.6422 | 0.6124 | 0.6250 | 0.6124 | 0.7620   | 0.7324   | 0.7607     | 0.7463 | 0.5952     | 0.6482       | 0.6206   | 0.4619       | 0.3808         | 0.4174     | 0.6490    | 0.5900      | 0.6181  | 0.8475       | 0.8551         | 0.8513     | 0.6912       | 0.6608         | 0.6757     | 0.5178        | 0.3913          | 0.4457      |
| 0.5312        | 13.5  | 19782 | 0.6777          | 0.7594   | 0.6362 | 0.6162 | 0.6247 | 0.6162 | 0.7594   | 0.7351   | 0.7494     | 0.7422 | 0.6058     | 0.6399       | 0.6224   | 0.4489       | 0.4226         | 0.4353     | 0.6337    | 0.6003      | 0.6165  | 0.8454       | 0.8539         | 0.8497     | 0.6744       | 0.6608         | 0.6676     | 0.5101        | 0.3862          | 0.4396      |
| 0.512         | 14.25 | 20881 | 0.6823          | 0.7569   | 0.6409 | 0.6172 | 0.6274 | 0.6172 | 0.7569   | 0.7051   | 0.7805     | 0.7409 | 0.6291     | 0.6051       | 0.6169   | 0.4830       | 0.4163         | 0.4472     | 0.6461    | 0.5988      | 0.6216  | 0.8506       | 0.8388         | 0.8447     | 0.6613       | 0.6757         | 0.6684     | 0.5113        | 0.4054          | 0.4522      |
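
The macro, balanced-accuracy, micro, and per-class columns could be produced by a `compute_metrics` callback along the lines of the sketch below; the card does not include the original implementation, so the function and key names are assumptions. Note that balanced accuracy is macro-averaged recall (the Balanced Acc and Recall columns are identical) and that micro F1 equals accuracy for single-label classification (the Micro F1 and Accuracy columns are identical).

```python
# Hypothetical compute_metrics reproducing the columns above with scikit-learn;
# the original implementation is not included in the card.
import numpy as np
from sklearn.metrics import (
    accuracy_score,
    balanced_accuracy_score,
    f1_score,
    precision_recall_fscore_support,
)

LABELS = ["joy", "anger", "disgust", "fear", "neutral", "sadness", "surprise"]  # assumed order

def compute_metrics(eval_pred):
    logits, y_true = eval_pred
    y_pred = np.argmax(logits, axis=-1)
    prec, rec, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="macro", zero_division=0
    )
    metrics = {
        "accuracy": accuracy_score(y_true, y_pred),
        "prec": prec,
        "recall": rec,
        "f1": f1,
        "b_acc": balanced_accuracy_score(y_true, y_pred),  # == macro recall
        "micro_f1": f1_score(y_true, y_pred, average="micro"),  # == accuracy here
    }
    per_p, per_r, per_f, _ = precision_recall_fscore_support(
        y_true, y_pred, labels=list(range(len(LABELS))), zero_division=0
    )
    for i, name in enumerate(LABELS):
        metrics[f"prec_{name}"] = per_p[i]
        metrics[f"recall_{name}"] = per_r[i]
        metrics[f"f1_{name}"] = per_f[i]
    return metrics
```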


### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.11.0