A pre-trained BERT model for Guarani (6 layers, cased), trained on the Guarani Wikipedia and Wiktionary (~800K tokens).
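A minimal usage sketch for masked-token prediction with the 🤗 Transformers `fill-mask` pipeline. The model ID below is a placeholder, not this model's actual name on the Hub, and the Guarani example sentence is illustrative only:

```python
def fill_mask(sentence, model_id="your-org/bert-base-guarani-cased"):
    """Predict candidates for the [MASK] token in `sentence`.

    `model_id` is a placeholder; substitute the model's real Hub ID.
    Downloads the model weights on first use.
    """
    from transformers import pipeline  # lazy import: needs `pip install transformers`

    fill = pipeline("fill-mask", model=model_id)
    # Returns a list of dicts with keys like "token_str" and "score",
    # ranked by the model's confidence for the masked position.
    return fill(sentence)


# Example call (hypothetical sentence; [MASK] is BERT's mask token):
# fill_mask("Che [MASK] porã.")
```

The pipeline infers the mask token from the tokenizer, so the same call works for any BERT-style checkpoint.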