BERT for Galician (Small)

This is a small pre-trained BERT model (6 layers, cased) for Galician (ILG/RAG spelling). It was evaluated on lexical semantics tasks using a dataset for identifying homonymy and synonymy in context, and was presented at ACL 2021.

There is also a base version (12 layers, cased): `marcosgg/bert-base-gl-cased`
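The model can be used for masked-token prediction with the Hugging Face `transformers` library. A minimal sketch, assuming the model id of this small version is `marcosgg/bert-small-gl-cased` (following the naming of the base version above); the example sentence is only illustrative:

```python
# Minimal fill-mask sketch with Hugging Face transformers.
# Assumption: this model is published as "marcosgg/bert-small-gl-cased".
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="marcosgg/bert-small-gl-cased")

# Galician sentence with one masked token.
predictions = fill_mask("A [MASK] de Santiago de Compostela.")

# Each prediction carries the filled token and its score.
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

By default the pipeline returns the top five candidate tokens; pass `top_k` to change that.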


If you use this model, please cite the following paper:

    @inproceedings{garcia-2021-exploring,
        title = "Exploring the Representation of Word Meanings in Context: {A} Case Study on Homonymy and Synonymy",
        author = "Garcia, Marcos",
        booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
        year = "2021",
        publisher = "Association for Computational Linguistics",
        url = "",
        doi = "10.18653/v1/2021.acl-long.281",
        pages = "3625--3640"
    }

