Commit c4dd803 by eduagarcia: Add paper link
Parent(s): 92932c3

README.md (CHANGED)
@@ -92,12 +92,12 @@ metrics:
 ---
 # RoBERTaLexPT-base
 
-RoBERTaLexPT-base is a Portuguese Masked Language Model pretrained from scratch from the [LegalPT](https://huggingface.co/datasets/eduagarcia/LegalPT_dedup) and [CrawlPT](https://huggingface.co/datasets/eduagarcia/CrawlPT_dedup) corpora, using the same architecture as [RoBERTa-base](https://huggingface.co/FacebookAI/roberta-base), introduced by
+RoBERTaLexPT-base is a Portuguese Masked Language Model pretrained from scratch from the [LegalPT](https://huggingface.co/datasets/eduagarcia/LegalPT_dedup) and [CrawlPT](https://huggingface.co/datasets/eduagarcia/CrawlPT_dedup) corpora, using the same architecture as [RoBERTa-base](https://huggingface.co/FacebookAI/roberta-base), introduced by Liu et al. (2019).
 
 - **Language(s) (NLP):** Brazilian Portuguese (pt-BR)
 - **License:** [Creative Commons Attribution 4.0 International Public License](https://creativecommons.org/licenses/by/4.0/deed.en)
 - **Repository:** https://github.com/eduagarcia/roberta-legal-portuguese
-- **Paper:**
+- **Paper:** https://aclanthology.org/2024.propor-1.38/
 
 ## Evaluation
 
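Since the card describes a masked language model, a quick way to try it is with the transformers fill-mask pipeline. The snippet below is a minimal sketch, assuming the checkpoint is published on the Hub under the owner's namespace as `eduagarcia/RoBERTaLexPT-base`; the example sentence is illustrative only.

```python
# Minimal sketch: masked-token prediction with a RoBERTa-style Portuguese model.
# The Hub ID "eduagarcia/RoBERTaLexPT-base" is assumed from the repository owner
# and model name in this card.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="eduagarcia/RoBERTaLexPT-base")

# RoBERTa-based tokenizers use "<mask>" as the mask token.
predictions = fill_mask("O réu foi condenado ao pagamento de <mask> por danos morais.")
for p in predictions:
    print(p["token_str"], p["score"])
```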