
# cel-eng
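A minimal usage sketch, assuming the standard Hugging Face `transformers` translation pipeline. The model ID `Helsinki-NLP/opus-mt-cel-en` is taken from this card; the input sentence is an illustrative Welsh example, not part of the original card.

```python
# Minimal sketch: translate a Celtic-language sentence to English with the
# transformers translation pipeline. The Welsh input is purely illustrative.
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-cel-en")
print(translator("Mae hi'n braf heddiw.")[0]["translation_text"])
```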

## Benchmarks

| testset | BLEU | chr-F |
|---------|------|-------|
| Tatoeba-test.bre-eng.bre.eng | 17.2 | 0.385 |
| Tatoeba-test.cor-eng.cor.eng | 3.0 | 0.172 |
| Tatoeba-test.cym-eng.cym.eng | 41.5 | 0.582 |
| Tatoeba-test.gla-eng.gla.eng | 15.4 | 0.330 |
| Tatoeba-test.gle-eng.gle.eng | 50.8 | 0.668 |
| Tatoeba-test.glv-eng.glv.eng | 11.0 | 0.297 |
| Tatoeba-test.multi.eng | 22.8 | 0.398 |
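Corpus-level scores in this format can be computed with the `sacrebleu` library; the card does not state which evaluation command produced the numbers above, so the snippet below is only a sketch with placeholder hypothesis and reference lists.

```python
# Sketch of computing corpus-level BLEU and chr-F with sacrebleu.
# `hypotheses` would hold the model's translations of a Tatoeba test set and
# `references` the English reference sentences (placeholders here).
import sacrebleu

hypotheses = ["It is nice today."]
references = [["It is nice today."]]  # one inner list per reference set

bleu = sacrebleu.corpus_bleu(hypotheses, references)
chrf = sacrebleu.corpus_chrf(hypotheses, references)

# sacrebleu reports chrF on a 0-100 scale; the table above uses 0-1.
print(f"BLEU = {bleu.score:.1f}, chr-F = {chrf.score / 100:.3f}")
```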

## System Info
