cardiffnlp/twitter-xlm-roberta-base-sentiment-multilingual
This model is a fine-tuned version of cardiffnlp/twitter-xlm-roberta-base on the cardiffnlp/tweet_sentiment_multilingual (all) dataset via tweetnlp.
Fine-tuning used the train split, and hyperparameters were tuned on the validation split.
The following metrics are achieved on the test split (a hedged reproduction sketch follows the list):
- F1 (micro): 0.6931034482758621
- F1 (macro): 0.692628774202147
- Accuracy: 0.6931034482758621
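For reference, here is a minimal sketch of how these numbers could be reproduced. It is not taken from the original card and rests on several assumptions: that the dataset's `all` config exposes `text` and `label` columns, that the integer labels follow the negative/neutral/positive order, and that `model.predict` returns a dict with a `label` string.
```python
from datasets import load_dataset
from sklearn.metrics import accuracy_score, f1_score
import tweetnlp

# Load the fine-tuned classifier (same call as in the Usage section below).
model = tweetnlp.Classifier(
    "cardiffnlp/twitter-xlm-roberta-base-sentiment-multilingual", max_length=128
)

# Assumption: the "all" config exposes "text"/"label" columns on the test split.
test = load_dataset("cardiffnlp/tweet_sentiment_multilingual", "all", split="test")

id2label = ["negative", "neutral", "positive"]  # assumed integer-to-name order
gold = [id2label[i] for i in test["label"]]
# Assumption: predict() returns a dict such as {'label': 'positive'}.
pred = [model.predict(text)["label"] for text in test["text"]]

print("F1 (micro):", f1_score(gold, pred, average="micro"))
print("F1 (macro):", f1_score(gold, pred, average="macro"))
print("Accuracy:  ", accuracy_score(gold, pred))
```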
Usage
Install tweetnlp via pip.
```bash
pip install tweetnlp
```
Load the model in Python and run a prediction.
```python
import tweetnlp

model = tweetnlp.Classifier("cardiffnlp/twitter-xlm-roberta-base-sentiment-multilingual", max_length=128)
model.predict('Get the all-analog Classic Vinyl Edition of "Takin Off" Album from {@herbiehancock@} via {@bluenoterecords@} link below {{URL}}')
```
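If you prefer to call the model through the Hugging Face transformers API directly, a minimal sketch follows. This is not part of the original card; the negative/neutral/positive label names are assumed to come from the model's config.
```python
from transformers import pipeline

# Assumption: the model config maps its three classes to
# negative/neutral/positive label names.
pipe = pipeline(
    "text-classification",
    model="cardiffnlp/twitter-xlm-roberta-base-sentiment-multilingual",
)
print(pipe('Get the all-analog Classic Vinyl Edition of "Takin Off" Album '
           'from {@herbiehancock@} via {@bluenoterecords@} link below {{URL}}'))
# e.g. [{'label': 'positive', 'score': ...}]
```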
Reference
```bibtex
@inproceedings{camacho-collados-etal-2022-tweetnlp,
title = "{T}weet{NLP}: Cutting-Edge Natural Language Processing for Social Media",
author = "Camacho-collados, Jose and
Rezaee, Kiamehr and
Riahi, Talayeh and
Ushio, Asahi and
Loureiro, Daniel and
Antypas, Dimosthenis and
Boisson, Joanne and
Espinosa Anke, Luis and
Liu, Fangyu and
      Mart{\'\i}nez C{\'a}mara, Eugenio and others",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
month = dec,
year = "2022",
address = "Abu Dhabi, UAE",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.emnlp-demos.5",
pages = "38--49"
}
```