Jacobo committed on
Commit 62e3f4f
1 Parent(s): 05be74c

Update README.md

Files changed (1)
README.md +9 -1
README.md CHANGED
@@ -1,9 +1,15 @@
 ---
 tags:
 - generated_from_trainer
+language:
+- grc
 model-index:
 - name: dioBERTo
   results: []
+widget:
+- text: "Πλάτων ὁ Περικτιόνης <mask> γένος ἀνέφερεν εἰς Σόλωνα."
+- text: "ὁ Κριτίας ἀπέβλεψε <mask> τὴν θύραν."
+
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -11,8 +17,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 # dioBERTo
 
-This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
+This is an experimental RoBERTa model trained on an ancient Greek corpus of about 900 MB, which was scraped from the web and post-processed. Duplicate texts and editorial punctuation were removed. The training dataset will soon be available in the Hugging Face datasets hub. Training a model for ancient Greek is challenging because it is a low-resource language, roughly 50% of whose register has survived only in fragmentary texts. The model is provided by the Diogenet project at the University of California, San Diego.
+
 It achieves the following results on the evaluation set:
+
 - Loss: 3.3351
 
 ## Model description
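
Below is a minimal usage sketch for the fill-mask widget examples added in this commit, using the `transformers` pipeline API. The repository id `Jacobo/dioBERTo` is an assumption inferred from the committer and model name; substitute the actual Hub id if it differs.

```python
# Minimal sketch: reproduce the model card's fill-mask widget examples with the
# transformers pipeline. The hub id "Jacobo/dioBERTo" is an assumption; replace
# it with the actual repository id if different.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="Jacobo/dioBERTo")

# First widget example from the README front matter; <mask> is the RoBERTa mask token.
for prediction in fill_mask("Πλάτων ὁ Περικτιόνης <mask> γένος ἀνέφερεν εἰς Σόλωνα."):
    print(prediction["token_str"], prediction["score"])
```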