mBERT+gn-base-cased (multilingual-BERT+gn-base-cased)

BERT multilingual base model (cased), a pre-trained BERT model covering 104 languages, fine-tuned for Guarani (gn) language modeling on Wikipedia and Wiktionary text (~800K tokens).
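A minimal usage sketch with the standard Hugging Face transformers fill-mask pipeline (the Guarani example sentence and masked word are illustrative, not taken from the model's evaluation data):

```python
from transformers import pipeline

# Load the fine-tuned mBERT+gn checkpoint for masked-token prediction.
fill_mask = pipeline("fill-mask", model="mmaguero/multilingual-bert-gn-base-cased")

# Illustrative Guarani sentence with one [MASK] token (BERT's mask format).
predictions = fill_mask("Che ahayhu che [MASK].")

for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```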

How to cite?

@article{aguero-et-al2023multi-affect-low-langs-grn,
  title={Multidimensional Affective Analysis for Low-resource Languages: A Use Case with Guarani-Spanish Code-switching Language},
  author={Agüero-Torales, Marvin Matías and López-Herrera, Antonio Gabriel and Vilares, David},
  journal={Cognitive Computation},
  year={2023},
  publisher={Springer},
  note={Forthcoming}
}