# opus-mt-cau-en

Language pair: cau (Caucasian languages) → eng (English)

## Benchmarks

| testset | BLEU | chr-F |
|---------|------|-------|
| Tatoeba-test.abk-eng.abk.eng | 0.3 | 0.134 |
| Tatoeba-test.ady-eng.ady.eng | 0.4 | 0.104 |
| Tatoeba-test.che-eng.che.eng | 0.6 | 0.128 |
| Tatoeba-test.kat-eng.kat.eng | 18.6 | 0.366 |
| Tatoeba-test.multi.eng | 16.6 | 0.351 |
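Like other Helsinki-NLP OPUS-MT checkpoints, this model can be loaded through the Hugging Face `transformers` library with the Marian classes. The sketch below is a minimal, hedged usage example (it assumes `transformers` and `sentencepiece` are installed; the `translate` helper name is ours, not part of any API):

```python
MODEL_NAME = "Helsinki-NLP/opus-mt-cau-en"

def translate(texts):
    """Translate a batch of Caucasian-language sentences to English.

    Imports are done lazily so the module loads even when the
    transformers library is not installed.
    """
    from transformers import MarianMTModel, MarianTokenizer

    tokenizer = MarianTokenizer.from_pretrained(MODEL_NAME)
    model = MarianMTModel.from_pretrained(MODEL_NAME)
    # Tokenize with padding so sentences of different lengths batch together.
    batch = tokenizer(texts, return_tensors="pt", padding=True)
    generated = model.generate(**batch)
    return [tokenizer.decode(t, skip_special_tokens=True) for t in generated]

if __name__ == "__main__":
    # Example Georgian (kat) input; the first call downloads the weights.
    print(translate(["გამარჯობა"]))
```

Note that translation quality will track the benchmark table above: Georgian (kat) input should work reasonably well, while abk/ady/che scores suggest largely unusable output for those languages.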

