basilis committed on
Commit
feb99a7
1 Parent(s): d4ef170

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -34,7 +34,7 @@ The pre-training corpora of `GreekLegalRoBERTa_v3` include:
  * We develop the code in [Hugging Face](https://huggingface.co)'s [Transformers](https://github.com/huggingface/transformers). We publish our code in the AI-team-UoA GitHub repository (https://github.com/AI-team-UoA/GreekLegalRoBERTa).
  * We released a model similar to the English `FacebookAI/roberta-base` for Greek legislative applications (12-layer, 768-hidden, 12-heads, 125M parameters).
  * We train for 100k training steps with a batch size of 4096 sequences of length 512 and an initial learning rate of 6e-4.
- * We pretrained our models using 4 V100 GPUs provided by the [Cyprus Research Institute](https://www.bing.com/search?pglt=41&q=Cyprus+Re-+search+Institute&cvid=5a277677e3e740a7a775c9ca0b342baa&gs_lcrp=EgZjaHJvbWUqBggAEEUYOzIGCAAQRRg7MgYIARAAGEAyBggCEAAYQDIGCAMQABhAMgYIBBAAGEAyBggFEAAYQDIGCAYQABhAMgYIBxAAGEAyBggIEAAYQNIBBzI1NmowajGoAgCwAgA&FORM=ANNTA1&PC=EDGEDSE). We would like to express our sincere gratitude to the Cyprus Research Institute for providing us with access to Cyclone. Without their support, this work would not have been possible.
+ * We pretrained our models using 4 V100 GPUs provided by the [Cyprus Research Institute](https://www.cyi.ac.cy/index.php/research/research-centers.html). We would like to express our sincere gratitude to the Cyprus Research Institute for providing us with access to Cyclone. Without their support, this work would not have been possible.


  ## Requirements