zoraidalhf committed
Commit: 150abc4
Parent(s): 296038e
Update README.md (update of the citation)

README.md CHANGED
````diff
@@ -38,20 +38,18 @@ In the recent years, Transformer-based models have lead to significant advances
 
 
 ## Citation
-Link to the paper: https://arxiv.org/abs/2206.15147
+Link to the paper: https://www.isca-speech.org/archive/pdfs/iberspeech_2022/gutierrezfandino22_iberspeech.pdf / https://arxiv.org/abs/2206.15147
 
 Cite this work:
 ```
-@
-
-
-
-
-
-
-year = {2022},
-copyright = {Creative Commons Attribution 4.0 International}
+@inproceedings{gutierrezfandino22_iberspeech,
+author={Asier Gutiérrez-Fandiño and David Pérez-Fernández and Jordi Armengol-Estapé and David Griol and Zoraida Callejas},
+title={{esCorpius: A Massive Spanish Crawling Corpus}},
+year=2022,
+booktitle={Proc. IberSPEECH 2022},
+pages={126--130},
+doi={10.21437/IberSPEECH.2022-26}
 }
 ```
 ## Disclaimer
-We did not perform any kind of filtering and/or censorship to the corpus. We expect users to do so applying their own methods. We are not
+We did not perform any kind of filtering and/or censorship to the corpus. We expect users to do so applying their own methods. We are not liable for any misuse of the corpus.
````