---
license: apache-2.0
tags: []
datasets:
- EXIST Dataset
metrics:
- accuracy
model-index:
- name: twitter_sexismo-finetuned-exist2021
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: EXIST Dataset
      type: EXIST Dataset
      args: es
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.86
---

# twitter_sexismo-finetuned-exist2021

This model is a fine-tuned version of [pysentimiento/robertuito-hate-speech](https://huggingface.co/pysentimiento/robertuito-hate-speech) on the EXIST dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4
- Accuracy: 0.86

## Model description

Model built for the Somos NLP Hackathon to detect sexism in Spanish-language tweets.
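
A minimal inference sketch using the `transformers` pipeline API. The hub id below is an assumption based on this card's title; substitute the model's actual repository path (`<org>/twitter_sexismo-finetuned-exist2021`):

```python
from transformers import pipeline

# Hypothetical hub path -- replace with the model's real "<org>/..." id.
classifier = pipeline(
    "text-classification",
    model="twitter_sexismo-finetuned-exist2021",
)

# Returns a list of {"label": ..., "score": ...} dicts for the input tweet.
print(classifier("Texto de ejemplo a clasificar"))
```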

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- adam_epsilon: 1e-08
- num_epochs: 8
- warmup: 3
- mini_batch_size: 32
- optimizer: AdamW with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
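
The linear schedule with warmup can be sketched in plain Python. This is an illustration, not the exact `transformers` implementation, and the step units (steps vs. epochs) and totals below are assumptions: the learning rate ramps up over the warmup period, then decays linearly to zero.

```python
def linear_schedule_lr(step, base_lr=5e-5, warmup=3, total_steps=8):
    """Linear warmup followed by linear decay to zero
    (a sketch of lr_scheduler_type='linear'; units are illustrative)."""
    if step < warmup:
        # ramp up from base_lr/warmup to base_lr
        return base_lr * (step + 1) / warmup
    # decay linearly from base_lr down to 0 at total_steps
    remaining = max(0, total_steps - step)
    return base_lr * remaining / (total_steps - warmup)

# learning rate rises during warmup, then falls toward zero
rates = [linear_schedule_lr(s) for s in range(8)]
```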

### Training results
| Epoch | Training Loss | Validation Loss | Accuracy | F1       | Precision | Recall   |
|------:|--------------:|----------------:|---------:|---------:|----------:|---------:|
| 1     | 0.398400      | 0.336709        | 0.861404 | 0.855311 | 0.872897  | 0.838420 |
| 2     | 0.136100      | 0.575872        | 0.846491 | 0.854772 | 0.794753  | 0.924596 |
| 3     | 0.105600      | 0.800685        | 0.848246 | 0.837863 | 0.876471  | 0.802513 |
| 4     | 0.066500      | 0.928388        | 0.849123 | 0.856187 | 0.801252  | 0.919210 |
| 5     | 0.004500      | 0.990655        | 0.851754 | 0.853680 | 0.824415  | 0.885099 |
| 6     | 0.005500      | 1.035315        | 0.852632 | 0.856164 | 0.818331  | 0.897666 |
| 7     | 0.000200      | 1.052970        | 0.857895 | 0.859375 | 0.831933  | 0.888689 |
| 8     | 0.001700      | 1.048338        | 0.856140 | 0.857143 | 0.832487  | 0.883303 |
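
Validation loss is lowest after the first epoch even as training loss keeps falling, a typical overfitting pattern. Selecting the checkpoint by validation loss can be sketched as follows (the per-epoch numbers are copied from the table above):

```python
# (epoch, validation_loss, accuracy) taken from the training results table
results = [
    (1, 0.336709, 0.861404),
    (2, 0.575872, 0.846491),
    (3, 0.800685, 0.848246),
    (4, 0.928388, 0.849123),
    (5, 0.990655, 0.851754),
    (6, 1.035315, 0.852632),
    (7, 1.052970, 0.857895),
    (8, 1.048338, 0.856140),
]

# choose the epoch with the lowest validation loss
best_epoch, best_val_loss, best_acc = min(results, key=lambda r: r[1])
```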


### Framework versions

- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Tokenizers 0.11.6