
roberta-large-bne-caresC

This model is a fine-tuned version of roberta-large-bne on the CARES Chapters dataset, used as a benchmark in the paper TODO. The model achieves an F1 score of 0.84.

For more information, please refer to the original publication: TODO LINK
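A minimal usage sketch with the `transformers` pipeline API, assuming the model is available on the Hub under `IIC/roberta-large-bne-caresC` and is loadable as a standard sequence-classification checkpoint (running it downloads the model weights; the example input sentence is invented):

```python
from transformers import pipeline

# Load the fine-tuned classifier from the Hub (requires network access).
classifier = pipeline(
    "text-classification",
    model="IIC/roberta-large-bne-caresC",
)

# Classify a (hypothetical) Spanish clinical text.
result = classifier("Paciente con dolor torácico y disnea de esfuerzo.")
print(result)
```

The label set and whether the task is single- or multi-label depend on how the CARES Chapters benchmark is defined; check the model config and the original publication before relying on the scores.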

Parameters used

| Parameter                | Value  |
|--------------------------|--------|
| batch size               | 16     |
| learning rate            | 2e-05  |
| classifier dropout       | 0.2    |
| warmup ratio             | 0      |
| warmup steps             | 0      |
| weight decay             | 0      |
| optimizer                | AdamW  |
| epochs                   | 10     |
| early stopping patience  | 3      |
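As a rough illustration, the hyperparameters above can be expressed as a Hugging Face `Trainer` configuration. This is a hypothetical reconstruction, not the authors' actual training script; argument names follow the `transformers` API, and `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

# Hypothetical mapping of the reported hyperparameters onto TrainingArguments.
training_args = TrainingArguments(
    output_dir="roberta-large-bne-caresC",  # placeholder path
    per_device_train_batch_size=16,
    learning_rate=2e-5,
    warmup_ratio=0.0,
    warmup_steps=0,
    weight_decay=0.0,
    num_train_epochs=10,
)

# The remaining settings live outside TrainingArguments:
# - classifier dropout 0.2 is set on the model config
#   (e.g. config.classifier_dropout = 0.2 before loading the model);
# - AdamW is the Trainer's default optimizer;
# - early stopping with patience 3 would be added via
#   EarlyStoppingCallback(early_stopping_patience=3).
```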

BibTeX entry and citation info

TODO