---
license: mit
base_model: roberta-large
tags:
- generated_from_trainer
metrics:
- accuracy
- recall
- f1
model-index:
- name: lora-roberta-large-no-roller
  results: []
library_name: peft
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# lora-roberta-large-no-roller

This model is a LoRA fine-tune of [roberta-large](https://huggingface.co/roberta-large) on an unspecified emotion-classification dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6492
- Accuracy: 0.7693
- Precision (macro): 0.6783
- Recall (macro): 0.6473
- F1 (macro): 0.6599
- Balanced accuracy: 0.6473
- Micro F1: 0.7693
- Precision (joy): 0.7606
- Recall (joy): 0.7609
- F1 (joy): 0.7607
- Precision (anger): 0.6490
- Recall (anger): 0.6564
- F1 (anger): 0.6527
- Precision (disgust): 0.4785
- Recall (disgust): 0.5179
- F1 (disgust): 0.4974
- Precision (fear): 0.7204
- Recall (fear): 0.6690
- F1 (fear): 0.6938
- Precision (neutral): 0.8316
- Recall (neutral): 0.8715
- F1 (neutral): 0.8511
- Precision (sadness): 0.7426
- Recall (sadness): 0.6653
- F1 (sadness): 0.7018
- Precision (surprise): 0.5654
- Recall (surprise): 0.3903
- F1 (surprise): 0.4618
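
In this list, precision, recall, and F1 are macro-averaged over the seven emotion classes; balanced accuracy is the mean of the per-class recalls (which is why it matches the macro recall), and for single-label classification micro F1 reduces to plain accuracy. The metric code is not included in this card, but a minimal `compute_metrics` sketch with scikit-learn that would produce numbers of this shape looks like the following (the label order is a guess, not taken from the model):

```python
import numpy as np
from sklearn.metrics import (accuracy_score, balanced_accuracy_score,
                             precision_recall_fscore_support)

# Hypothetical label order; the card does not state the id-to-emotion mapping.
LABELS = ["joy", "anger", "disgust", "fear", "neutral", "sadness", "surprise"]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)

    prec, rec, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0)
    metrics = {
        "accuracy": accuracy_score(labels, preds),
        "prec": prec,
        "recall": rec,
        "f1": f1,
        # Balanced accuracy == mean of per-class recalls == macro recall.
        "b_acc": balanced_accuracy_score(labels, preds),
        # For single-label tasks, micro F1 equals accuracy.
        "micro_f1": precision_recall_fscore_support(
            labels, preds, average="micro", zero_division=0)[2],
    }

    # Per-class breakdown ("prec_joy", "recall_joy", "f1_joy", ...).
    p, r, f, _ = precision_recall_fscore_support(
        labels, preds, average=None, zero_division=0)
    for i, name in enumerate(LABELS):
        metrics[f"prec_{name}"] = p[i]
        metrics[f"recall_{name}"] = r[i]
        metrics[f"f1_{name}"] = f[i]
    return metrics
```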

## Model description

This repository contains a LoRA adapter (via PEFT) for `roberta-large`, trained for single-label emotion classification over seven classes: joy, anger, disgust, fear, neutral, sadness, and surprise (see the per-class metrics above). The training dataset is not documented in this card.
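
A minimal usage sketch, assuming the adapter weights live in this repository (the repo id below is a placeholder) and that the classification head was saved alongside the adapter:

```python
import torch
from peft import PeftModel
from transformers import AutoModelForSequenceClassification, AutoTokenizer

ADAPTER_ID = "your-username/lora-roberta-large-no-roller"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained("roberta-large")
base = AutoModelForSequenceClassification.from_pretrained(
    "roberta-large", num_labels=7)  # seven emotion classes, per the metrics above
model = PeftModel.from_pretrained(base, ADAPTER_ID)
model.eval()

inputs = tokenizer("I can't believe this actually worked!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# Prints a class index; the id-to-emotion mapping is not given in this card.
print(logits.argmax(dim=-1).item())
```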

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 15.0
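
For reproduction, these values map onto `transformers.TrainingArguments` roughly as follows. Only the values listed above come from the card; the output directory and evaluation cadence are assumptions (the results table suggests an eval every 1159 optimizer steps):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="lora-roberta-large-no-roller",  # assumed output path
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 effective train batch size
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=15.0,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    # Assumption: evaluate/log on the cadence seen in the results table.
    evaluation_strategy="steps",
    eval_steps=1159,
    logging_steps=1159,
)
```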

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Precision | Recall | F1     | Balanced Acc | Micro F1 | Precision Joy | Recall Joy | F1 Joy | Precision Anger | Recall Anger | F1 Anger | Precision Disgust | Recall Disgust | F1 Disgust | Precision Fear | Recall Fear | F1 Fear | Precision Neutral | Recall Neutral | F1 Neutral | Precision Sadness | Recall Sadness | F1 Sadness | Precision Surprise | Recall Surprise | F1 Surprise |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:------:|:------:|:------:|:--------:|:--------:|:----------:|:------:|:----------:|:------------:|:--------:|:------------:|:--------------:|:----------:|:---------:|:-----------:|:-------:|:------------:|:--------------:|:----------:|:------------:|:--------------:|:----------:|:-------------:|:---------------:|:-----------:|
| 0.7936        | 0.75  | 1159  | 0.7811          | 0.7151   | 0.6488 | 0.5609 | 0.5848 | 0.5609 | 0.7151   | 0.6199   | 0.8341     | 0.7112 | 0.5064     | 0.6239       | 0.5590   | 0.6071       | 0.2732         | 0.3768     | 0.8054    | 0.4889      | 0.6084  | 0.8561       | 0.7799         | 0.8162     | 0.6749       | 0.5951         | 0.6325     | 0.4717        | 0.3312          | 0.3891      |
| 0.7404        | 1.5   | 2318  | 0.7121          | 0.7385   | 0.6664 | 0.5694 | 0.6008 | 0.5694 | 0.7385   | 0.7692   | 0.7050     | 0.7357 | 0.5220     | 0.6354       | 0.5732   | 0.6652       | 0.2732         | 0.3873     | 0.7082    | 0.5789      | 0.6371  | 0.7929       | 0.8857         | 0.8367     | 0.7494       | 0.5369         | 0.6256     | 0.4576        | 0.3710          | 0.4097      |
| 0.713         | 2.25  | 3477  | 0.7058          | 0.7460   | 0.6427 | 0.6191 | 0.6269 | 0.6191 | 0.7460   | 0.7403   | 0.7506     | 0.7454 | 0.5511     | 0.6387       | 0.5917   | 0.4204       | 0.5            | 0.4568     | 0.7208    | 0.5766      | 0.6407  | 0.8344       | 0.8447         | 0.8395     | 0.7142       | 0.6476         | 0.6793     | 0.5178        | 0.3753          | 0.4352      |
| 0.7173        | 3.0   | 4636  | 0.6973          | 0.7482   | 0.6831 | 0.5963 | 0.6211 | 0.5963 | 0.7482   | 0.7181   | 0.7529     | 0.7351 | 0.7473     | 0.4054       | 0.5256   | 0.4491       | 0.575          | 0.5043     | 0.7169    | 0.6070      | 0.6574  | 0.7877       | 0.8961         | 0.8384     | 0.7280       | 0.6387         | 0.6804     | 0.6347        | 0.2989          | 0.4064      |
| 0.6988        | 3.75  | 5795  | 0.6958          | 0.7445   | 0.6635 | 0.6000 | 0.6212 | 0.6000 | 0.7445   | 0.6616   | 0.8326     | 0.7373 | 0.6363     | 0.5851       | 0.6096   | 0.6263       | 0.3321         | 0.4341     | 0.6422    | 0.6047      | 0.6229  | 0.8472       | 0.8195         | 0.8331     | 0.6973       | 0.6644         | 0.6805     | 0.5333        | 0.3613          | 0.4308      |
| 0.661         | 4.5   | 6954  | 0.6763          | 0.7543   | 0.6901 | 0.6081 | 0.6333 | 0.6081 | 0.7543   | 0.7145   | 0.7805     | 0.7461 | 0.6601     | 0.5769       | 0.6157   | 0.7074       | 0.2893         | 0.4106     | 0.7588    | 0.5813      | 0.6583  | 0.8369       | 0.8462         | 0.8415     | 0.6341       | 0.7364         | 0.6815     | 0.5188        | 0.4462          | 0.4798      |
| 0.6632        | 5.25  | 8113  | 0.6745          | 0.7543   | 0.6671 | 0.6177 | 0.6386 | 0.6177 | 0.7543   | 0.7283   | 0.7653     | 0.7463 | 0.6232     | 0.6109       | 0.6170   | 0.4665       | 0.4732         | 0.4699     | 0.7629    | 0.6058      | 0.6754  | 0.8178       | 0.8689         | 0.8426     | 0.7752       | 0.5902         | 0.6702     | 0.4961        | 0.4097          | 0.4488      |
| 0.6427        | 6.0   | 9272  | 0.6729          | 0.7514   | 0.6607 | 0.6306 | 0.6403 | 0.6306 | 0.7514   | 0.8023   | 0.6707     | 0.7306 | 0.6623     | 0.5827       | 0.6199   | 0.4785       | 0.4768         | 0.4776     | 0.7312    | 0.6269      | 0.6751  | 0.8013       | 0.8919         | 0.8441     | 0.7613       | 0.6124         | 0.6788     | 0.3879        | 0.5527          | 0.4559      |
| 0.6363        | 6.75  | 10431 | 0.6584          | 0.7579   | 0.6635 | 0.6367 | 0.6475 | 0.6367 | 0.7579   | 0.7349   | 0.7887     | 0.7608 | 0.5808     | 0.6818       | 0.6273   | 0.5335       | 0.4268         | 0.4742     | 0.7103    | 0.6222      | 0.6633  | 0.8489       | 0.8333         | 0.8410     | 0.7227       | 0.6764         | 0.6988     | 0.5135        | 0.4280          | 0.4669      |
| 0.6134        | 7.5   | 11590 | 0.6490          | 0.7634   | 0.6768 | 0.6351 | 0.6538 | 0.6351 | 0.7634   | 0.7710   | 0.7383     | 0.7543 | 0.6355     | 0.6258       | 0.6306   | 0.5637       | 0.4268         | 0.4858     | 0.7412    | 0.6433      | 0.6888  | 0.8244       | 0.8720         | 0.8476     | 0.7001       | 0.6889         | 0.6944     | 0.5018        | 0.4505          | 0.4748      |
| 0.6045        | 8.25  | 12749 | 0.6494          | 0.7612   | 0.6648 | 0.6525 | 0.6555 | 0.6525 | 0.7612   | 0.7566   | 0.7571     | 0.7569 | 0.6401     | 0.6085       | 0.6239   | 0.4371       | 0.5893         | 0.5019     | 0.7557    | 0.6222      | 0.6825  | 0.8329       | 0.8613         | 0.8469     | 0.7556       | 0.6498         | 0.6987     | 0.4760        | 0.4796          | 0.4778      |
| 0.6139        | 9.0   | 13908 | 0.6585          | 0.7561   | 0.6730 | 0.6388 | 0.6499 | 0.6388 | 0.7561   | 0.7070   | 0.8144     | 0.7570 | 0.5686     | 0.7264       | 0.6379   | 0.5846       | 0.4071         | 0.48       | 0.7307    | 0.6187      | 0.6700  | 0.8639       | 0.8125         | 0.8374     | 0.7312       | 0.6698         | 0.6991     | 0.5247        | 0.4226          | 0.4681      |
| 0.5942        | 9.75  | 15067 | 0.6422          | 0.7661   | 0.6838 | 0.6436 | 0.6605 | 0.6436 | 0.7661   | 0.7413   | 0.7812     | 0.7607 | 0.6120     | 0.6938       | 0.6503   | 0.5619       | 0.4375         | 0.4920     | 0.7531    | 0.6351      | 0.6891  | 0.8407       | 0.8505         | 0.8455     | 0.7505       | 0.6684         | 0.7071     | 0.5271        | 0.4387          | 0.4789      |
| 0.5798        | 10.5  | 16226 | 0.6553          | 0.7614   | 0.6828 | 0.6358 | 0.6495 | 0.6358 | 0.7614   | 0.7338   | 0.7856     | 0.7588 | 0.5979     | 0.6905       | 0.6409   | 0.6692       | 0.3179         | 0.4310     | 0.7168    | 0.6632      | 0.6889  | 0.8489       | 0.8389         | 0.8439     | 0.7273       | 0.672          | 0.6985     | 0.4854        | 0.4828          | 0.4841      |
| 0.5513        | 11.25 | 17385 | 0.6538          | 0.7612   | 0.6640 | 0.6499 | 0.6550 | 0.6499 | 0.7612   | 0.7121   | 0.8059     | 0.7561 | 0.6487     | 0.6574       | 0.6530   | 0.5348       | 0.4393         | 0.4824     | 0.6504    | 0.6877      | 0.6686  | 0.8594       | 0.8270         | 0.8429     | 0.7018       | 0.6924         | 0.6971     | 0.5410        | 0.4398          | 0.4852      |
| 0.5544        | 12.0  | 18544 | 0.6492          | 0.7693   | 0.6783 | 0.6473 | 0.6599 | 0.6473 | 0.7693   | 0.7606   | 0.7609     | 0.7607 | 0.6490     | 0.6564       | 0.6527   | 0.4785       | 0.5179         | 0.4974     | 0.7204    | 0.6690      | 0.6938  | 0.8316       | 0.8715         | 0.8511     | 0.7426       | 0.6653         | 0.7018     | 0.5654        | 0.3903          | 0.4618      |
| 0.5229        | 12.75 | 19703 | 0.6428          | 0.7701   | 0.6822 | 0.6449 | 0.6612 | 0.6449 | 0.7701   | 0.7439   | 0.7820     | 0.7625 | 0.6647     | 0.6584       | 0.6615   | 0.5302       | 0.4393         | 0.4805     | 0.7452    | 0.6327      | 0.6844  | 0.8446       | 0.8563         | 0.8504     | 0.6988       | 0.7147         | 0.7067     | 0.5478        | 0.4312          | 0.4826      |
| 0.5222        | 13.5  | 20862 | 0.6565          | 0.7656   | 0.6705 | 0.6486 | 0.6586 | 0.6486 | 0.7656   | 0.7366   | 0.7862     | 0.7606 | 0.6581     | 0.6531       | 0.6556   | 0.5108       | 0.4643         | 0.4864     | 0.7322    | 0.6269      | 0.6755  | 0.8479       | 0.8465         | 0.8472     | 0.7176       | 0.6933         | 0.7052     | 0.4899        | 0.4699          | 0.4797      |
| 0.4972        | 14.25 | 22021 | 0.6574          | 0.7672   | 0.6725 | 0.6502 | 0.6596 | 0.6502 | 0.7672   | 0.7537   | 0.7703     | 0.7619 | 0.6368     | 0.6737       | 0.6547   | 0.4859       | 0.4929         | 0.4894     | 0.7302    | 0.6363      | 0.68    | 0.8447       | 0.8525         | 0.8486     | 0.7057       | 0.7044         | 0.7051     | 0.5506        | 0.4215          | 0.4775      |


### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.11.0