---
license: mit
base_model: roberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: irony_de_Austria
  results: []
---


# irony_de_Austria

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on a subset of the MultiPICo dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0034
- Accuracy: 0.6376
- Precision: 0.4444
- Recall: 0.7299
- F1: 0.5525
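
As a minimal usage sketch, the fine-tuned checkpoint can be loaded with the `transformers` pipeline API. The Hub namespace below is a placeholder, since this card does not state the owning organization:

```python
from transformers import pipeline

# "<org>" is a placeholder for the actual Hub namespace of this model.
classifier = pipeline(
    "text-classification",
    model="<org>/irony_de_Austria",
)

# Returns e.g. [{"label": "LABEL_1", "score": 0.87}]; label names are the
# default LABEL_0/LABEL_1 unless id2label was set during training.
print(classifier("Na super, schon wieder Montag."))
```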

## Model description

The model is trained on German-language instances (covering all linguistic varieties), using only the annotations of annotators from Austria. These annotations are aggregated by majority voting, and the resulting labels are used to train the model.
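
As an illustration, the majority-vote aggregation could be implemented as in the following sketch. The column names (`instance_id`, `annotator_country`, `label`) are assumptions for illustration, not the actual MultiPICo schema:

```python
import pandas as pd

# Hypothetical annotation table; MultiPICo's real schema may differ.
annotations = pd.DataFrame({
    "instance_id":       [1, 1, 1, 2, 2, 2],
    "annotator_country": ["Austria"] * 6,
    "label":             [1, 1, 0, 0, 0, 1],  # 1 = ironic, 0 = not ironic
})

# Keep only Austrian annotators, then take the most frequent label per
# instance (ties resolve to the lower label via mode()'s sort order).
austrian = annotations[annotations["annotator_country"] == "Austria"]
majority = austrian.groupby("instance_id")["label"].agg(lambda s: s.mode().iloc[0])
print(majority)  # instance 1 -> 1, instance 2 -> 0
```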

## Training and evaluation data

The model has been trained on the annotations of Austrian annotators in the MultiPICo dataset. The data has been randomly split into a training set and a validation set, as sketched below.
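
A sketch of such a split with the `datasets` library; the toy data and the 0.25 ratio are assumptions, while `seed=42` matches the training hyperparameters reported further down:

```python
from datasets import Dataset

# Toy stand-in for the aggregated Austrian-annotator data.
dataset = Dataset.from_dict({
    "text": ["Beispiel eins", "Beispiel zwei", "Beispiel drei", "Beispiel vier"],
    "label": [0, 1, 0, 1],
})

# The card does not report the split ratio; 0.25 is an assumption.
splits = dataset.train_test_split(test_size=0.25, seed=42)
train_ds, val_ds = splits["train"], splits["test"]
```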

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
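
A hedged reconstruction of these settings as `TrainingArguments` follows; the output directory is a placeholder, and any options not listed above (e.g. weight decay, warmup) are assumed to be left at their defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="irony_de_Austria",   # placeholder path
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",     # assumption: the table below reports per-epoch metrics
)
```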

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.0044        | 1.0   | 84   | 0.0042          | 0.5928   | 0.3953    | 0.6204 | 0.4830 |
| 0.0041        | 2.0   | 168  | 0.0040          | 0.5638   | 0.3858    | 0.7153 | 0.5013 |
| 0.0038        | 3.0   | 252  | 0.0039          | 0.4966   | 0.3562    | 0.7956 | 0.4921 |
| 0.0037        | 4.0   | 336  | 0.0034          | 0.6577   | 0.4592    | 0.6569 | 0.5405 |
| 0.0032        | 5.0   | 420  | 0.0032          | 0.6600   | 0.4603    | 0.6350 | 0.5337 |
| 0.0031        | 6.0   | 504  | 0.0032          | 0.5213   | 0.3729    | 0.8248 | 0.5136 |
| 0.0029        | 7.0   | 588  | 0.0031          | 0.5884   | 0.4064    | 0.7445 | 0.5258 |
| 0.0024        | 8.0   | 672  | 0.0029          | 0.5839   | 0.4089    | 0.8029 | 0.5419 |
| 0.0020        | 9.0   | 756  | 0.0032          | 0.6465   | 0.4483    | 0.6642 | 0.5353 |
| 0.0021        | 10.0  | 840  | 0.0028          | 0.6331   | 0.4395    | 0.7153 | 0.5444 |
| 0.0016        | 11.0  | 924  | 0.0038          | 0.6823   | 0.4855    | 0.6131 | 0.5419 |
| 0.0009        | 12.0  | 1008 | 0.0034          | 0.6376   | 0.4444    | 0.7299 | 0.5525 |


### Framework versions

- Transformers 4.34.1
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.14.1