lighteternal committed
Commit 56fb6b2
Parent: 169fa69

Update README.md

Files changed (1):

  1. README.md (+13, -11)
README.md CHANGED
@@ -3,21 +3,23 @@ tags:
 - translation
 ---
 
 
-| language | thumbnail | tags | license | datasets | metrics |
-| ------------- | ------------- | ------------- | ------------- | ------------- | ------------- |
-| English-Greek | lighteternal/SSE-TUC-mt-en-el-cased | translation | Apache2 | Opus, CC-Matrix | BLEU, chrF |
-
-# English to Greek NMT from Hellenic Army Academy (SSE) and Technical University of Crete (TUC)
-
-## Model description
+## English to Greek NMT from Hellenic Army Academy (SSE) and Technical University of Crete (TUC)
+
+* source languages: en
+* target languages: el
+* licence: apache-2.0
+* dataset: opus, ccmatric
+* model: transformer (fairseq)
+* pre-processing: tokenization + BPE segmentation
+* metrics: bleu, chrf
 
+### Model description
 
 Trained using the Fairseq framework, transformer_iwslt_de_en architecture.\
 BPE segmentation (20k codes).\
-Mixed-case model. \
+Mixed-case model.
 
-#### How to use
+### How to use
 
 ```
 from transformers import FSMTTokenizer, FSMTForConditionalGeneration
@@ -49,14 +51,14 @@ Consolidated corpus from Opus and CC-Matrix (~6.6GB in total)
 ## Eval results
 
 
-Results on Tatoeba testset (EN-EL): \
+Results on Tatoeba testset (EN-EL):
 
 | BLEU | chrF |
 | ------ | ------ |
 | 76.9 | 0.733 |
 
 
-Results on XNLI parallel (EN-EL): \
+Results on XNLI parallel (EN-EL):
 
 | BLEU | chrF |
 | ------ | ------ |
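The eval tables in this diff report chrF alongside BLEU. As a rough illustration of what chrF measures — a character n-gram F-score — here is a minimal pure-Python sketch. The function names and simplifications are mine (the card's reported scores would come from a standard implementation such as sacreBLEU, not this code):

```python
from collections import Counter

def char_ngrams(text: str, n: int) -> Counter:
    """Character n-grams of a whitespace-normalized string."""
    s = " ".join(text.split())
    return Counter(s[i:i + n] for i in range(len(s) - n + 1))

def chrf(hypothesis: str, reference: str, max_n: int = 6, beta: float = 2.0) -> float:
    """Simplified chrF: mean character n-gram precision/recall for n = 1..max_n,
    combined into an F-beta score (beta = 2 weights recall twice as heavily)."""
    precisions, recalls = [], []
    for n in range(1, max_n + 1):
        hyp, ref = char_ngrams(hypothesis, n), char_ngrams(reference, n)
        if not hyp or not ref:
            continue  # strings too short for this n-gram order
        overlap = sum((hyp & ref).values())  # clipped n-gram matches
        precisions.append(overlap / sum(hyp.values()))
        recalls.append(overlap / sum(ref.values()))
    if not precisions:
        return 0.0
    p = sum(precisions) / len(precisions)
    r = sum(recalls) / len(recalls)
    if p + r == 0.0:
        return 0.0
    return (1 + beta ** 2) * p * r / (beta ** 2 * p + r)

print(chrf("the cat sat", "the cat sat"))  # identical strings score 1.0
```

Because it matches at the character level, chrF rewards partially correct word forms, which is one reason it is often reported next to word-level BLEU for morphologically rich target languages such as Greek.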