Update README.md

README.md

* We develop the code in [Hugging Face](https://huggingface.co)'s [Transformers](https://github.com/huggingface/transformers) and publish it in the [AI-team-UoA GitHub repository](https://github.com/AI-team-UoA/GreekLegalRoBERTa).
* We release a model similar to the English `FacebookAI/roberta-base` for Greek legislative applications (12-layer, 768-hidden, 12-heads, 125M parameters); a usage sketch follows this list.
* We train for 100k steps with a batch size of 4,096 sequences of length 512 and an initial learning rate of 6e-4; a mapping of these settings onto Transformers' `TrainingArguments` is sketched after this list.
* We pretrained our models on 4 NVIDIA V100 GPUs provided by the [Cyprus Research Institute](https://www.cyi.ac.cy/index.php/research/research-centers.html). We would like to express our sincere gratitude to the Cyprus Research Institute for providing us with access to Cyclone; without their support, this work would not have been possible.
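
As a rough illustration, the training settings above might map onto Transformers' `TrainingArguments` as in the sketch below. Only the 100k steps, the 4,096-sequence global batch, and the 6e-4 learning rate come from the list; the per-device batch size and gradient-accumulation factor are assumptions chosen to reach that global batch on 4 GPUs, and the actual recipe lives in the AI-team-UoA repository.

```python
from transformers import TrainingArguments

# Sketch only: per-device batch and accumulation are assumed, not documented.
args = TrainingArguments(
    output_dir="greek-legal-roberta-pretraining",
    max_steps=100_000,               # 100k training steps
    learning_rate=6e-4,              # initial (peak) learning rate
    per_device_train_batch_size=16,  # assumed: 16 x 4 GPUs x 64 accum = 4,096
    gradient_accumulation_steps=64,  # assumed, to reach the global batch size
)
```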
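
For a quick start, the released model can be loaded through the standard Transformers auto classes, as in the minimal sketch below. The Hub id `AI-team-UoA/GreekLegalRoBERTa_v3` is an assumption inferred from the repository and model names; substitute the actual checkpoint id if it differs.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "AI-team-UoA/GreekLegalRoBERTa_v3"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill in the mask of a Greek legal-style sentence:
# "The Constitution is the fundamental <mask> of the state."
text = f"Το Σύνταγμα είναι ο θεμελιώδης {tokenizer.mask_token} του κράτους."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Report the five most likely fillers for the masked position.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_pos[0]].topk(5).indices.tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))
```
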
## Requirements