bsc-temu committed
Commit d480e1f
Parent: 6a26b61

Update README.md

Files changed (1):
1. README.md +27 -14
README.md CHANGED
@@ -9,6 +9,33 @@ license: apache-2.0
 
 # BERTa: RoBERTa-based Catalan language model
 
+## BibTeX citation
+
+If you use any of these resources (datasets or models) in your work, please cite our latest paper:
+
+```bibtex
+@inproceedings{armengol-estape-etal-2021-multilingual,
+    title = "Are Multilingual Models the Best Choice for Moderately Under-resourced Languages? {A} Comprehensive Assessment for {C}atalan",
+    author = "Armengol-Estap{\'e}, Jordi and
+      Carrino, Casimiro Pio and
+      Rodriguez-Penagos, Carlos and
+      de Gibert Bonet, Ona and
+      Armentano-Oller, Carme and
+      Gonzalez-Agirre, Aitor and
+      Melero, Maite and
+      Villegas, Marta",
+    booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
+    month = aug,
+    year = "2021",
+    address = "Online",
+    publisher = "Association for Computational Linguistics",
+    url = "https://aclanthology.org/2021.findings-acl.437",
+    doi = "10.18653/v1/2021.findings-acl.437",
+    pages = "4933--4946",
+}
+```
+
+
 ## Model description
 
 BERTa is a transformer-based masked language model for the Catalan language.
@@ -207,17 +234,3 @@ Below, an example of how to use the masked language modelling task with a pipeline
 
 
 
-### BibTeX citation
-
-If you use this resource in your work, please cite our latest paper:
-
-```bibtex
-@misc{armengolestape2021multilingual,
-      title={Are Multilingual Models the Best Choice for Moderately Under-resourced Languages? A Comprehensive Assessment for Catalan},
-      author={Jordi Armengol{-}Estap{\'{e}} and Casimiro Pio Carrino and Carlos Rodriguez-Penagos and Ona de Gibert Bonet and Carme Armentano{-}Oller and Aitor Gonzalez{-}Agirre and Maite Melero and Marta Villegas},
-      year={2021},
-      eprint={2107.07903},
-      archivePrefix={arXiv},
-      primaryClass={cs.CL}
-}
-```
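The second hunk's context line references the README's fill-mask usage example, which this diff does not include. As a minimal sketch of that kind of usage, assuming the Hugging Face `transformers` pipeline API and a model id of `BSC-TeMU/roberta-base-ca` (inferred from the committing organisation, not stated in this diff):

```python
# Hedged sketch, not part of this commit: querying BERTa through the
# fill-mask pipeline mentioned in the hunk context above.
# The model id "BSC-TeMU/roberta-base-ca" is an assumption inferred from
# the committer (bsc-temu); check the model card for the canonical id.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="BSC-TeMU/roberta-base-ca")

# RoBERTa-style models use "<mask>" as the mask token.
for prediction in unmasker("La capital de Catalunya és <mask>."):
    print(prediction["token_str"], prediction["score"])
```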