---
language: gn
license: mit
datasets:
- wikipedia
- wiktionary
widget:
- text: 'Paraguay ha''e peteĩ táva oĩva [MASK] retãme  '
- text: Augusto Roa Bastos ha'e peteĩ [MASK] arandu
metrics:
- accuracy
- f1
---

# BERT-i-large-cased (gnBERT-large-cased)

A pre-trained BERT model for **Guarani** (24 layers, cased), trained on Wikipedia and Wiktionary (~800K tokens).
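Below is a minimal usage sketch with the 🤗 Transformers fill-mask pipeline, mirroring the widget examples above. The repository id in the snippet is a placeholder assumption, so replace it with this model's actual Hub path.

```python
from transformers import pipeline

# Placeholder Hub repository id -- substitute the model's actual path.
fill_mask = pipeline("fill-mask", model="<user>/gnBERT-large-cased")

# First widget example, roughly: "Paraguay is a city located in the [MASK] country."
for pred in fill_mask("Paraguay ha'e peteĩ táva oĩva [MASK] retãme"):
    print(f"{pred['token_str']}\t{pred['score']:.4f}")
```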

# How to cite?

```
@article{aguero-et-al2023multi-affect-low-langs-grn,
  title={Multidimensional Affective Analysis for Low-resource Languages: A Use Case with Guarani-Spanish Code-switching Language},
  author={Agüero-Torales, Marvin Matías and López-Herrera, Antonio Gabriel and Vilares, David},
  journal={Cognitive Computation},
  year={2023},
  publisher={Springer},
  note={Forthcoming}
}
```