|
--- |
|
license: apache-2.0 |
|
|
datasets: |
|
- EXIST Dataset |
|
metrics: |
|
- accuracy |
|
model-index: |
|
- name: twitter_sexismo-finetuned-exist2021 |
|
results: |
|
- task: |
|
name: Text Classification |
|
type: text-classification |
|
dataset: |
|
name: EXIST Dataset |
|
type: EXIST Dataset |
|
args: es |
|
metrics: |
|
- name: Accuracy |
|
type: accuracy |
|
value: 0.86 |
|
--- |
|
|
|
# twitter_sexismo-finetuned-exist2021 |
|
|
|
This model is a fine-tuned version of [pysentimiento/robertuito-hate-speech](https://huggingface.co/pysentimiento/robertuito-hate-speech) on the EXIST dataset. |
|
It achieves the following results on the evaluation set: |
|
- Loss: 0.4 |
|
- Accuracy: 0.86 |
|
|
|
## Model description |
|
|
|
Model built for the Somos NLP Hackathon to detect sexism in Spanish-language tweets.
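A minimal inference sketch. The label names below are assumptions (the card does not state the checkpoint's `id2label` mapping), and the Hub repo id is not given here, so the transformers call is shown only as a comment; the runnable part is the stdlib post-processing from raw logits to a label:

```python
import math

# Hypothetical label names -- the actual checkpoint's id2label mapping may differ.
LABELS = ("non-sexist", "sexist")

def label_from_logits(logits, labels=LABELS):
    """Softmax over the raw classifier logits, then return (label, probability)."""
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return labels[best], probs[best]

# With transformers installed, the logits (or directly the label) would come
# from the model itself, e.g.:
#   from transformers import pipeline
#   clf = pipeline("text-classification", model="<hub-repo-id>")  # id not stated in this card
#   print(clf("ejemplo de tuit"))
print(label_from_logits([0.3, 1.7]))
```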
|
|
|
## Intended uses & limitations |
|
|
|
More information needed |
|
|
|
## Training and evaluation data |
|
|
|
More information needed |
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training:

- learning_rate: 5e-5

- adam_epsilon: 1e-8

- warmup_steps: 3

- train_batch_size: 32

- optimizer: AdamW with betas=(0.9, 0.999) and epsilon=1e-08

- lr_scheduler_type: linear

- num_epochs: 8
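These settings might map onto transformers' `TrainingArguments` roughly as follows. This is a sketch, not the actual training script: the output directory is a placeholder, and treating the warmup value (3) as scheduler warmup *steps* is an assumption.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="twitter_sexismo-finetuned-exist2021",  # placeholder path
    learning_rate=5e-5,
    adam_epsilon=1e-8,
    num_train_epochs=8,
    warmup_steps=3,  # assumption: the warmup value counts steps, not epochs
    per_device_train_batch_size=32,
    lr_scheduler_type="linear",
    # AdamW with betas=(0.9, 0.999) is the transformers default optimizer.
)
```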
|
|
|
### Training results |
|
| Epoch | Training Loss | Validation Loss | Accuracy | F1       | Precision | Recall   |
|------:|--------------:|----------------:|---------:|---------:|----------:|---------:|
| 1     | 0.398400      | 0.336709        | 0.861404 | 0.855311 | 0.872897  | 0.838420 |
| 2     | 0.136100      | 0.575872        | 0.846491 | 0.854772 | 0.794753  | 0.924596 |
| 3     | 0.105600      | 0.800685        | 0.848246 | 0.837863 | 0.876471  | 0.802513 |
| 4     | 0.066500      | 0.928388        | 0.849123 | 0.856187 | 0.801252  | 0.919210 |
| 5     | 0.004500      | 0.990655        | 0.851754 | 0.853680 | 0.824415  | 0.885099 |
| 6     | 0.005500      | 1.035315        | 0.852632 | 0.856164 | 0.818331  | 0.897666 |
| 7     | 0.000200      | 1.052970        | 0.857895 | 0.859375 | 0.831933  | 0.888689 |
| 8     | 0.001700      | 1.048338        | 0.856140 | 0.857143 | 0.832487  | 0.883303 |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.17.0 |
|
- PyTorch 1.10.0+cu111
|
- Tokenizers 0.11.6 |
|
|