---
language: es
tags:
- biomedical
- clinical
- spanish
- xlm-roberta-large
license: mit
datasets:
- IIC/livingner3
metrics:
- f1
model-index:
- name: IIC/xlm-roberta-large-livingner3
results:
- task:
type: multi-label-classification
dataset:
name: livingner3
type: IIC/livingner3
split: test
metrics:
- name: f1
type: f1
value: 0.606
pipeline_tag: text-classification
---
# xlm-roberta-large-livingner3
This model is a fine-tuned version of xlm-roberta-large for the livingner3 dataset, used in a benchmark in the paper TODO. The model achieves an F1 score of 0.606 on the test split.

Please refer to the original publication for more information. TODO LINK
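Below is a minimal usage sketch with the `transformers` pipeline API. The `top_k=None` and sigmoid settings are assumptions based on the multi-label task declared above, and the input sentence is illustrative only.

```python
from transformers import pipeline

# Load the fine-tuned model from the Hugging Face Hub.
classifier = pipeline(
    "text-classification",
    model="IIC/xlm-roberta-large-livingner3",
    top_k=None,                   # return scores for every label
    function_to_apply="sigmoid",  # multi-label: independent per-label scores
)

# Illustrative Spanish clinical sentence.
print(classifier("Paciente con infección por Staphylococcus aureus."))
```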
## Parameters used
| Parameter               | Value |
|-------------------------|-------|
| Batch size              | 16    |
| Learning rate           | 2e-05 |
| Classifier dropout      | 0     |
| Warmup ratio            | 0     |
| Warmup steps            | 0     |
| Weight decay            | 0     |
| Optimizer               | AdamW |
| Epochs                  | 10    |
| Early stopping patience | 3     |
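These values map naturally onto `transformers.TrainingArguments`. The sketch below is a hypothetical reconstruction, not the original training script: `output_dir`, the evaluation/saving strategies, and the metric name are assumptions.

```python
from transformers import AutoConfig, TrainingArguments, EarlyStoppingCallback

# Hypothetical reconstruction of the hyperparameters in the table above.
config = AutoConfig.from_pretrained(
    "xlm-roberta-large",
    classifier_dropout=0.0,                     # classifier dropout = 0
    problem_type="multi_label_classification",  # matches the model-index task
)

training_args = TrainingArguments(
    output_dir="xlm-roberta-large-livingner3",  # placeholder
    per_device_train_batch_size=16,  # batch size
    learning_rate=2e-5,              # learning rate
    warmup_ratio=0.0,                # warmup ratio
    warmup_steps=0,                  # warmup steps
    weight_decay=0.0,                # weight decay
    num_train_epochs=10,             # epochs
    # AdamW is the default optimizer used by Trainer.
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,     # required for early stopping
    metric_for_best_model="f1",
)

# Early stopping with patience 3 would be passed to Trainer as a callback:
early_stopping = EarlyStoppingCallback(early_stopping_patience=3)
```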
## BibTeX entry and citation info
TODO