# LEPISZCZE-aspectemo-allegro__herbert-base-cased-v1

## Description

Fine-tuned allegro/herbert-base-cased model on the clarin-pl/aspectemo dataset (aspect-based sentiment analysis, token classification).

Trained with the clarin-pl-embeddings library and included in the LEPISZCZE benchmark.
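
The snippet below is a minimal usage sketch, assuming the checkpoint exposes a standard token-classification head compatible with the `transformers` pipeline API; the example sentence and the `aggregation_strategy` choice are illustrative only.

```python
# Minimal usage sketch; assumes a standard token-classification head.
from transformers import pipeline

model_id = "clarin-knext/LEPISZCZE-aspectemo-allegro__herbert-base-cased-v1"
nlp = pipeline("token-classification", model=model_id, aggregation_strategy="simple")

# Polish review sentence; aspect terms should receive sentiment labels
# such as a_plus_m, a_minus_m, or a_zero.
print(nlp("Obsługa była bardzo miła, ale jedzenie okazało się zimne."))
```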

## Results on clarin-pl/aspectemo

| Metric | Value |
|---|---|
| accuracy | 0.952 |
| f1_macro | 0.368 |
| f1_micro | 0.585 |
| f1_weighted | 0.586 |
| recall_macro | 0.371 |
| recall_micro | 0.566 |
| recall_weighted | 0.566 |
| precision_macro | 0.392 |
| precision_micro | 0.606 |
| precision_weighted | 0.617 |

## Metrics per class

| Class | Precision | Recall | F1 | Support |
|---|---|---|---|---|
| a_amb | 0.200 | 0.033 | 0.057 | 91 |
| a_minus_m | 0.632 | 0.542 | 0.584 | 1033 |
| a_minus_s | 0.156 | 0.209 | 0.178 | 67 |
| a_plus_m | 0.781 | 0.694 | 0.735 | 1015 |
| a_plus_s | 0.153 | 0.220 | 0.180 | 41 |
| a_zero | 0.431 | 0.529 | 0.475 | 501 |
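
As a hedged illustration (not the original evaluation code), the per-class numbers above follow the usual precision/recall/F1 definitions over token-level labels; a report in this format can be produced with scikit-learn, as in the sketch below, where toy labels stand in for the real model predictions.

```python
# Illustrative only: toy token-level labels, not the real model predictions.
from sklearn.metrics import classification_report

labels = ["a_amb", "a_minus_m", "a_minus_s", "a_plus_m", "a_plus_s", "a_zero"]
y_true = ["a_plus_m", "a_minus_m", "a_zero", "a_plus_m", "a_minus_s"]
y_pred = ["a_plus_m", "a_zero", "a_zero", "a_minus_m", "a_minus_s"]

print(classification_report(y_true, y_pred, labels=labels, digits=3, zero_division=0))
```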

## Finetuning hyperparameters

| Hyperparameter | Value |
|---|---|
| use_scheduler | True |
| optimizer | AdamW |
| warmup_steps | 25 |
| learning_rate | 0.0005 |
| adam_epsilon | 1e-05 |
| weight_decay | 0 |
| finetune_last_n_layers | 4 |
| classifier_dropout | 0.2 |
| max_seq_length | 512 |
| batch_size | 64 |
| max_epochs | 20 |
| early_stopping_monitor | val/Loss |
| early_stopping_mode | min |
| early_stopping_patience | 3 |
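
The sketch below shows one way these hyperparameters map onto a standard PyTorch/transformers optimizer and warmup-scheduler setup; it is an assumption-laden illustration (the linear schedule, label count, and step count are placeholders), not the actual training code from the clarin-pl-embeddings library.

```python
# Illustrative mapping of the hyperparameters above onto a PyTorch setup;
# the real training loop lives in the clarin-pl-embeddings library.
import torch
from transformers import AutoModelForTokenClassification, get_linear_schedule_with_warmup

model = AutoModelForTokenClassification.from_pretrained(
    "allegro/herbert-base-cased",
    num_labels=7,  # placeholder; the real label set comes from clarin-pl/aspectemo
)

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-4, eps=1e-5, weight_decay=0.0)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=25,
    num_training_steps=1000,  # placeholder; depends on dataset size, batch_size=64, max_epochs=20
)
```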

## Citation (BibTeX)

```bibtex
@article{augustyniak2022way,
  title={This is the way: designing and compiling LEPISZCZE, a comprehensive NLP benchmark for Polish},
  author={Augustyniak, Lukasz and Tagowski, Kamil and Sawczyn, Albert and Janiak, Denis and Bartusiak, Roman and Szymczak, Adrian and Janz, Arkadiusz and Szyma{\'n}ski, Piotr and W{\k{a}}troba, Marcin and Morzy, Miko{\l}aj and others},
  journal={Advances in Neural Information Processing Systems},
  volume={35},
  pages={21805--21818},
  year={2022}
}
```