
bert-base-spanish-wwm-cased-cantemist

This model is a fine-tuned version of bert-base-spanish-wwm-cased on the CANTEMIST dataset, which is used as a benchmark in the paper TODO. The model achieves an F1 score of 0.898.

For more information, please refer to the original publication: TODO LINK

Parameters used

| Parameter               | Value |
|-------------------------|-------|
| batch size              | 64    |
| learning rate           | 4e-05 |
| classifier dropout      | 0.1   |
| warmup ratio            | 0     |
| warmup steps            | 0     |
| weight decay            | 0     |
| optimizer               | AdamW |
| epochs                  | 10    |
| early stopping patience | 3     |
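For convenience, the table above can be collected into a plain Python dict. This is a sketch for anyone re-running the fine-tuning; the key names follow the `transformers.TrainingArguments` convention, but that mapping is an assumption on our part, not taken from the original training script:

```python
# Hyperparameters from the table above, keyed by (assumed)
# transformers.TrainingArguments argument names. The name mapping
# is illustrative, not the authors' actual training script.
hyperparameters = {
    "per_device_train_batch_size": 64,
    "learning_rate": 4e-05,
    "warmup_ratio": 0.0,
    "warmup_steps": 0,
    "weight_decay": 0.0,
    "num_train_epochs": 10,
}

# Classifier dropout (0.1) is a model-config setting rather than a
# TrainingArguments field; it would be passed when loading the model.
model_config_overrides = {"classifier_dropout": 0.1}

# Early stopping patience (3) would correspond to
# transformers.EarlyStoppingCallback(early_stopping_patience=3).
```

AdamW is the default optimizer in recent versions of `transformers`, so it needs no explicit argument.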

BibTeX entry and citation info

TODO