
# bertimbau-base-ulyssesner_br-bcod-self_learning

This model is a fine-tuned version of neuralmind/bert-large-portuguese-cased, trained with self-learning techniques on data from the UlyssesNER-Br corpus and on bill summaries from the Brazilian Chamber of Deputies.
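
The card itself does not include a usage snippet; a minimal sketch of loading the model for named entity recognition with the Hugging Face `transformers` pipeline might look like the following. The example sentence is illustrative, and the entity labels returned follow the UlyssesNER-Br tag set, not the standard CoNLL labels.

```python
from transformers import pipeline

# Token-classification (NER) pipeline using the model id from this card.
# aggregation_strategy="simple" merges word-piece tokens into whole entities.
ner = pipeline(
    "ner",
    model="ronunes/bertimbau-base-ulyssesner_br-bcod-self_learning",
    aggregation_strategy="simple",
)

# An illustrative Portuguese legislative-style sentence.
text = "Dispõe sobre a atuação da Câmara dos Deputados em Brasília."

entities = ner(text)
for ent in entities:
    print(ent["entity_group"], ent["word"], round(float(ent["score"]), 3))
```

Each returned entity is a dict with `entity_group`, `word`, `score`, `start`, and `end` keys, so the character offsets can be mapped back onto the original summary text.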

Paper accepted at the 16th International Conference on Computational Processing of Portuguese (PROPOR 2024).

Paper Link: https://aclanthology.org/2024.propor-1.30/

BibTeX:

```bibtex
@inproceedings{nunes-etal-2024-named,
    title = "A Named Entity Recognition Approach for {P}ortuguese Legislative Texts Using Self-Learning",
    author = "Nunes, Rafael Oleques and Balreira, Dennis Giovani and Spritzer, Andr{\'e} Suslik and Freitas, Carla Maria Dal Sasso",
    editor = "Gamallo, Pablo and Claro, Daniela and Teixeira, Ant{\'o}nio and Real, Livy and Garcia, Marcos and Oliveira, Hugo Gon{\c{c}}alo and Amaro, Raquel",
    booktitle = "Proceedings of the 16th International Conference on Computational Processing of Portuguese",
    month = mar,
    year = "2024",
    address = "Santiago de Compostela, Galicia/Spain",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2024.propor-1.30",
    pages = "290--300",
}
```

