---
language: multilingual
widget:
- text: "πŸ€—πŸ€—πŸ€—"
- text: "πŸ”₯The goal of life is <mask>. πŸ”₯"
- text: "Il segreto della vita Γ¨ l’<mask>. ❀️"
- text: "Hasta <mask> πŸ‘‹!"
license: mit
---

# Twitter-XLM-Roberta-large

This is an XLM-T large language model specialised for Twitter. The base model is multilingual XLM-R, which was further trained on over one billion tweets in multiple languages up until December 2022.

To evaluate this and other language models on Twitter-specific data, please refer to the [XLM-T main repository](https://github.com/cardiffnlp/xlm-t). A base-size XLM-T model and sample code are available [here](https://huggingface.co/cardiffnlp/twitter-xlm-roberta-base). Finally, this model is fully compatible with the [TweetNLP library](https://github.com/cardiffnlp/tweetnlp).

### BibTeX entry and citation info

More information can be found in the reference papers on [multilingual language models on Twitter](https://aclanthology.org/2022.lrec-1.27/) and [time-specific models](https://aclanthology.org/2022.acl-demo.25/). Please cite the relevant reference papers if you use this model.
```bibtex
@inproceedings{barbieri-etal-2022-xlm,
    title = "{XLM}-{T}: Multilingual Language Models in {T}witter for Sentiment Analysis and Beyond",
    author = "Barbieri, Francesco  and
      Espinosa Anke, Luis  and
      Camacho-Collados, Jose",
    booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
    month = jun,
    year = "2022",
    address = "Marseille, France",
    publisher = "European Language Resources Association",
    url = "https://aclanthology.org/2022.lrec-1.27",
    pages = "258--266"
}

@inproceedings{loureiro-etal-2022-timelms,
    title = "{T}ime{LM}s: Diachronic Language Models from {T}witter",
    author = "Loureiro, Daniel  and
      Barbieri, Francesco  and
      Neves, Leonardo  and
      Espinosa Anke, Luis  and
      Camacho-collados, Jose",
    booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: System Demonstrations",
    month = may,
    year = "2022",
    address = "Dublin, Ireland",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.acl-demo.25",
    doi = "10.18653/v1/2022.acl-demo.25",
    pages = "251--260"
}
```
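Like the base-size model, this checkpoint can be loaded for masked-token prediction with the `transformers` pipeline. A minimal sketch, assuming the repository id `cardiffnlp/twitter-xlm-roberta-large-2022` (adjust it to this card's actual id if it differs):

```python
# Masked-token prediction with the fill-mask pipeline.
# The model id below is an assumption; replace it with this card's id.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="cardiffnlp/twitter-xlm-roberta-large-2022",
)

# XLM-R-based models use "<mask>" as the mask token.
preds = fill_mask("The goal of life is <mask>.")
for pred in preds:
    print(pred["token_str"], round(pred["score"], 4))
```

Each prediction is a dict containing the candidate token (`token_str`), its probability (`score`), and the filled-in sentence (`sequence`).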