---
language:
- de
- eu
tags:
- translation
license: apache-2.0
---
### deu-eus

* source group: German
* target group: Basque
* OPUS readme: [deu-eus](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/deu-eus/README.md)
* model: transformer-align
* source language(s): deu
* target language(s): eus
* pre-processing: normalization + SentencePiece (spm12k,spm12k)
* download original weights: [opus-2020-06-16.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/deu-eus/opus-2020-06-16.zip)
* test set translations: [opus-2020-06-16.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/deu-eus/opus-2020-06-16.test.txt)
* test set scores: opus-2020-06-16.eval.txt
## Benchmarks

| testset | BLEU | chr-F |
|---------|------|-------|
| Tatoeba-test.deu.eus | 31.8 | 0.574 |
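The `brevity_penalty` and `ref_len` fields in the System Info below come from BLEU's length correction, which penalizes hypotheses shorter than the reference. A minimal sketch of that formula; the hypothesis length of ~2614 is back-calculated from the reported penalty and reference length, not stated on the card:

```python
import math

def brevity_penalty(hyp_len, ref_len):
    # BLEU's brevity penalty: 1.0 when the hypothesis is at least as long
    # as the reference, otherwise exp(1 - ref_len / hyp_len).
    if hyp_len >= ref_len:
        return 1.0
    return math.exp(1 - ref_len / hyp_len)

# With ref_len = 2829.0 and a total hypothesis length of about 2614
# characters, this reproduces the card's brevity_penalty of 0.921.
print(round(brevity_penalty(2614, 2829.0), 3))
```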
### System Info:
- hf_name: deu-eus
- source_languages: deu
- target_languages: eus
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/deu-eus/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['de', 'eu']
- src_constituents: {'deu'}
- tgt_constituents: {'eus'}
- src_multilingual: False
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm12k,spm12k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/deu-eus/opus-2020-06-16.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/deu-eus/opus-2020-06-16.test.txt
- src_alpha3: deu
- tgt_alpha3: eus
- short_pair: de-eu
- chrF2_score: 0.574
- bleu: 31.8
- brevity_penalty: 0.921
- ref_len: 2829.0
- src_name: German
- tgt_name: Basque
- train_date: 2020-06-16
- src_alpha2: de
- tgt_alpha2: eu
- prefer_old: False
- long_pair: deu-eus
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41