Helsinki-NLP/opus-mt-en-zlw

Frameworks: pytorch, tf
Contributed by the Language Technology Research Group at the University of Helsinki

How to use this model directly from the 🤗/transformers library:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-zlw")
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-en-zlw")
```
eng-zlw

  • source group: English

  • target group: West Slavic languages

  • OPUS readme: eng-zlw

  • model: transformer

  • source language(s): eng

  • target language(s): ces csb_Latn dsb hsb pol

  • pre-processing: normalization + SentencePiece (spm32k,spm32k)

  • a sentence-initial language token is required, in the form >>id<< (id = a valid target language ID)

  • download original weights: opus2m-2020-08-02.zip

  • test set translations: opus2m-2020-08-02.test.txt

  • test set scores: opus2m-2020-08-02.eval.txt
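
The sentence-initial token requirement above means every input must be prefixed with its target-language ID before tokenization. Below is a minimal sketch of that prefixing step (the helper name `prefix_target` is illustrative, not part of the library); the prefixed string is then passed to the tokenizer and model loaded as in the snippet above:

```python
# Prepend the target-language token this multilingual model expects,
# e.g. >>ces<< for Czech or >>pol<< for Polish.
def prefix_target(sentence: str, target_id: str) -> str:
    return f">>{target_id}<< {sentence}"

print(prefix_target("How are you today?", "ces"))
# → >>ces<< How are you today?
```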

Benchmarks

| testset | BLEU | chr-F |
|---|---|---|
| newssyscomb2009-engces.eng.ces | 20.6 | 0.488 |
| news-test2008-engces.eng.ces | 18.3 | 0.466 |
| newstest2009-engces.eng.ces | 19.8 | 0.483 |
| newstest2010-engces.eng.ces | 19.8 | 0.486 |
| newstest2011-engces.eng.ces | 20.6 | 0.489 |
| newstest2012-engces.eng.ces | 18.6 | 0.464 |
| newstest2013-engces.eng.ces | 22.3 | 0.495 |
| newstest2015-encs-engces.eng.ces | 21.7 | 0.502 |
| newstest2016-encs-engces.eng.ces | 24.5 | 0.521 |
| newstest2017-encs-engces.eng.ces | 20.1 | 0.480 |
| newstest2018-encs-engces.eng.ces | 19.9 | 0.483 |
| newstest2019-encs-engces.eng.ces | 21.2 | 0.490 |
| Tatoeba-test.eng-ces.eng.ces | 43.7 | 0.632 |
| Tatoeba-test.eng-csb.eng.csb | 1.2 | 0.188 |
| Tatoeba-test.eng-dsb.eng.dsb | 1.5 | 0.167 |
| Tatoeba-test.eng-hsb.eng.hsb | 5.7 | 0.199 |
| Tatoeba-test.eng.multi | 42.8 | 0.632 |
| Tatoeba-test.eng-pol.eng.pol | 43.2 | 0.641 |
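
The chr-F column reports a character n-gram F-score alongside BLEU. The following is a simplified pure-Python sketch of the metric (character n-grams up to order 6, combined with an F-score using beta = 2), not the reference sacreBLEU implementation, just to make the numbers above concrete:

```python
from collections import Counter

def char_ngrams(text: str, n: int) -> Counter:
    # Count character n-grams, ignoring spaces.
    text = text.replace(" ", "")
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def chrf(hypothesis: str, reference: str, max_n: int = 6, beta: float = 2.0) -> float:
    # Average n-gram precision and recall over orders 1..max_n,
    # then combine them with an F-beta score (beta=2 favours recall).
    precisions, recalls = [], []
    for n in range(1, max_n + 1):
        hyp, ref = char_ngrams(hypothesis, n), char_ngrams(reference, n)
        total_hyp, total_ref = sum(hyp.values()), sum(ref.values())
        if total_hyp == 0 or total_ref == 0:
            continue
        overlap = sum((hyp & ref).values())
        precisions.append(overlap / total_hyp)
        recalls.append(overlap / total_ref)
    if not precisions:
        return 0.0
    p = sum(precisions) / len(precisions)
    r = sum(recalls) / len(recalls)
    return 0.0 if p + r == 0 else (1 + beta**2) * p * r / (beta**2 * p + r)

print(round(chrf("dobrý den", "dobrý den"), 3))  # identical strings score 1.0
```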

System Info: