Jacobo committed
Commit
ab8794a
1 Parent(s): 43c8b89

Update README.md

Files changed (1)
  1. README.md +3 -1
README.md CHANGED
@@ -17,7 +17,9 @@ widget:
 
 aristoBERTo is a pre-trained model for ancient Greek, a low-resource language. We initialized the pre-training with weights from [GreekBERT](https://huggingface.co/nlpaueb/bert-base-greek-uncased-v1), a Greek version of BERT pre-trained on a large corpus of modern Greek (~30 GB of text). We continued the pre-training with an ancient Greek corpus of about 900 MB, which was scraped from the web and post-processed. Duplicate texts and editorial punctuation were removed.
 
-Applied to the processing of ancient Greek, aristoBERTo outperforms xlm-roberta-base and mdenberta in most downstream fine-tuning tasks like the labeling of POS, MORPH, DEP and LEMMA. aristoBERTo is provided by the Diogenet project of the University of California, San Diego.
+Applied to the processing of ancient Greek, aristoBERTo outperforms xlm-roberta-base and mdeberta in most downstream tasks like the labeling of POS, MORPH, DEP and LEMMA.
+
+aristoBERTo is provided by the Diogenet project of the University of California, San Diego.
 
 
 ## Intended uses
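
The continued pre-training the README describes can be reproduced in outline with the `transformers` Trainer. The sketch below is not part of the commit: the corpus file name, sequence length, and hyperparameters are illustrative assumptions, and only the GreekBERT checkpoint ID comes from the README.

```python
# Minimal sketch of the continued pre-training described above, assuming a
# plain-text corpus file "ancient_greek_corpus.txt" (hypothetical name) and
# illustrative hyperparameters; only the GreekBERT checkpoint is from the README.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

checkpoint = "nlpaueb/bert-base-greek-uncased-v1"  # warm start from GreekBERT
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Load the ancient Greek corpus as one text example per line.
corpus = load_dataset("text", data_files={"train": "ancient_greek_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = corpus.map(tokenize, batched=True, remove_columns=["text"])

# Standard masked-language-modeling objective (15% of tokens masked).
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(output_dir="aristoberto-mlm", per_device_train_batch_size=16)
Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
).train()
```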
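
Since aristoBERTo is BERT-based, the most direct way to try it once published on the Hub is a fill-mask pipeline. The model ID `Jacobo/aristoBERTo` below is an assumption inferred from the committer's namespace, not stated in this diff, and the Greek prompt is illustrative.

```python
# Minimal usage sketch, assuming the model is published on the Hugging Face Hub
# as "Jacobo/aristoBERTo" (hypothetical ID, not confirmed by this commit).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="Jacobo/aristoBERTo")

# Illustrative ancient Greek prompt with one masked token.
for prediction in fill_mask("ὁ δὲ [MASK] ἔφη."):
    print(prediction["token_str"], round(prediction["score"], 3))
```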