metadata
language:
- ca
license: '???'
tags:
- catalan
- textual entailment
- teca
- CaText
- Catalan Textual Corpus
datasets:
- projecte-aina/teca
metrics:
- accuracy
model-index:
- name: roberta-base-ca-cased-te
results:
- task:
type: text-classification
dataset:
type: projecte-aina/teca
name: teca
metrics:
- type: accuracy
value: 0.7912139892578125
widget:
- text: M'agrades. T'estimo.
- text: M'agrada el sol i la calor. A la Garrotxa plou molt.
- text: El llibre va caure per la finestra. El llibre va sortir volant.
- text: El meu aniversari és el 23 de maig. Faré anys a finals de maig.
Catalan RoBERTa-base trained on the Catalan Textual Corpus and fine-tuned for Catalan Textual Entailment.
roberta-base-ca-cased-te is a Textual Entailment (TE) model for the Catalan language, fine-tuned from the BERTa model, a RoBERTa-base model pre-trained on a medium-sized corpus collected from publicly available corpora and crawlers (see the BERTa model card for more details).
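As a usage sketch (not part of the original card), the model can presumably be loaded with the Hugging Face `transformers` library. The Hub id `projecte-aina/roberta-base-ca-cased-te` is an assumption inferred from the metadata above, and the predicted label names are read from the model's own `id2label` mapping rather than hard-coded here.

```python
# Hedged usage sketch: assumes the model is published on the Hugging Face Hub
# as "projecte-aina/roberta-base-ca-cased-te" (inferred from the metadata above)
# and that `transformers` and `torch` are installed.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "projecte-aina/roberta-base-ca-cased-te"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

# Textual entailment classifies a premise/hypothesis pair (as in the widget examples).
premise = "El llibre va caure per la finestra."
hypothesis = "El llibre va sortir volant."

inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred_id = int(logits.argmax(dim=-1))
print(model.config.id2label[pred_id])  # label names come from the model config
```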
Datasets
We used the Catalan Textual Entailment dataset TECA for training and evaluation.
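As a hedged sketch (not from the original card), TECA can presumably be loaded with the `datasets` library using the Hub id listed in the metadata; the splits and column names printed below are whatever the dataset actually exposes, not guarantees.

```python
# Hedged sketch: assumes the dataset is available on the Hub as
# "projecte-aina/teca" (taken from the metadata above).
from datasets import load_dataset

teca = load_dataset("projecte-aina/teca")
print(teca)              # inspect the available splits and their sizes
print(teca["train"][0])  # inspect one example to see the actual column names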
Evaluation and results
Below is the evaluation result on the TECA test set:

| Model | TECA (accuracy) |
| ----- | --------------- |
| BERTa | 79.12           |

For more details, check the fine-tuning and evaluation scripts in the official GitHub repository.
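A hedged sketch of how the reported accuracy could be checked follows, assuming the Hub ids above and `premise`/`hypothesis`/`label` column names in TECA; the fine-tuning and evaluation scripts in the official GitHub repository remain the authoritative reference.

```python
# Hedged evaluation sketch: the Hub ids, the "test" split name and the column
# names ("premise", "hypothesis", "label") are assumptions, not guarantees.
import torch
import evaluate
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "projecte-aina/roberta-base-ca-cased-te"   # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

test = load_dataset("projecte-aina/teca", split="test")  # assumed split name
accuracy = evaluate.load("accuracy")

predictions, references = [], []
for example in test:
    inputs = tokenizer(example["premise"], example["hypothesis"],
                       truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    predictions.append(int(logits.argmax(dim=-1)))
    references.append(example["label"])

print(accuracy.compute(predictions=predictions, references=references))
```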
Citing
If you use any of these resources (datasets or models) in your work, please cite our latest paper:
@inproceedings{armengol-estape-etal-2021-multilingual,
title = "Are Multilingual Models the Best Choice for Moderately Under-resourced Languages? {A} Comprehensive Assessment for {C}atalan",
author = "Armengol-Estap{\'e}, Jordi and
Carrino, Casimiro Pio and
Rodriguez-Penagos, Carlos and
de Gibert Bonet, Ona and
Armentano-Oller, Carme and
Gonzalez-Agirre, Aitor and
Melero, Maite and
Villegas, Marta",
booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.findings-acl.437",
doi = "10.18653/v1/2021.findings-acl.437",
pages = "4933--4946",
}
Funding
TODO
Disclaimer
TODO