Commit 480015d
Parent: 6bf83db
Update README.md

README.md CHANGED
@@ -63,6 +63,21 @@ fairseq-train \
 
 The model was evaluated with BLEU, where we compared the reference pictogram translation with the model hypothesis.
 
+```bash
+fairseq-generate exp_orfeo/data-bin/orfeo.tokenized.fr-frp \
+    --path exp_orfeo/checkpoints/nmt_fr_frp_orfeo/checkpoint.best_bleu_87.2803.pt \
+    --batch-size 128 --beam 5 --remove-bpe > gen_orfeo.out
+```
+The output file prints the following information:
+```txt
+S-16709 peut-être vous pouvez vous exprimer
+T-16709 vous pouvoir exprimer
+H-16709 -0.0769597738981247 vous pouvoir exprimer
+D-16709 -0.0769597738981247 vous pouvoir exprimer
+P-16709 -0.0936 -0.0924 -0.0065 -0.1154
+Generate test with beam=5: BLEU4 = 87.43, 95.2/89.8/85.0/80.4 (BP=1.000, ratio=1.006, syslen=250949, reflen=249520)
+```
+
 ### Results
 
 Comparison to other translation models:
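The `S-`/`T-`/`H-` records shown in the added output above can be grouped by sentence id for downstream scoring. Below is a minimal sketch, assuming whitespace-separated fields as printed in the log excerpt; `parse_generate_output` is a hypothetical helper name, not part of fairseq.

```python
def parse_generate_output(lines):
    """Group S- (source), T- (reference) and H- (hypothesis) records
    from a fairseq-generate log by sentence id.

    Returns {id: {"src": ..., "ref": ..., "hyp": ..., "score": ...}}.
    The H- record carries the model log-probability before the tokens.
    """
    examples = {}
    for line in lines:
        fields = line.split()
        if not fields:
            continue
        tag, sep, idx = fields[0].partition("-")
        # Skip D-/P- records and summary lines such as "Generate test with beam=5: ..."
        if sep != "-" or tag not in {"S", "T", "H"} or not idx.isdigit():
            continue
        ex = examples.setdefault(int(idx), {})
        if tag == "S":
            ex["src"] = " ".join(fields[1:])
        elif tag == "T":
            ex["ref"] = " ".join(fields[1:])
        else:  # H: log-probability, then the hypothesis tokens
            ex["score"] = float(fields[1])
            ex["hyp"] = " ".join(fields[2:])
    return examples
```

The resulting hypothesis/reference pairs (e.g. from `gen_orfeo.out`) can then be fed to an external scorer for a sanity check against the BLEU reported by fairseq.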