Update README.md
README.md
CHANGED
@@ -37,6 +37,8 @@ library_name: transformers
 
 It is a 1.3B parameter model, based on [GPTNeo](https://huggingface.co/EleutherAI/gpt-neo-1.3B), which has 24 layers and a hidden size of 2048.
 
+You can check our [paper](https://aclanthology.org/2024.propor-1.45/), accepted at PROPOR 2024.
+
 ## Training Data
 **GlórIA 1.3B** was trained on a large corpus of approximately 35B tokens. This corpus was built by gathering multiple Portuguese sources:
 - [ArquivoPT News PT-PT Dataset](): A collection of 1.4M European Portuguese archived news and periodicals from [Arquivo.pt](https://arquivo.pt/).
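Since the card declares `library_name: transformers`, a minimal usage sketch for the model described above might look like the following. This is an illustration only: the repo id `NOVA-vision-language/GlorIA-1.3B` and the generation settings are assumptions, not confirmed by this diff.

```python
# Minimal sketch (assumptions noted): load a GPTNeo-based causal LM with Hugging Face transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NOVA-vision-language/GlorIA-1.3B"  # assumed repo id; replace with the actual model id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation for a European Portuguese prompt.
prompt = "A capital de Portugal é"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```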