Jacobo committed on
Commit
fa1d531
1 Parent(s): ab8794a

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -19,12 +19,12 @@ aristoBERTo is a pre-trained model for ancient Greek, a low resource language.
 
 Applied to the processing of ancient Greek, aristoBERTo outperforms xlm-roberta-base and mdeberta in most downstream tasks like the labeling of POS, MORPH, DEP and LEMMA.
 
-aristoBERTo is provided by the Diogenet project of the University of California, San Diego.
+aristoBERTo is provided by the [Diogenet project](https://diogenet.ucsd.edu) of the University of California, San Diego.
 
 
 ## Intended uses
 
-This model was created for fine-tuning with spaCy and the Universal Dependency datasets for ancient Greek and a NER annotated corpus produced by the Diogenet project.
+This model was created for fine-tuning with spaCy and the ancient Greek Universal Dependency datasets and a NER corpus produced by the Diogenet project.
 
 
 It achieves the following results on the evaluation set: