metadata
language:
  - bg
  - en
tags:
  - translation
  - opus-mt-tc
license: cc-by-4.0
model-index:
  - name: opus-mt-tc-big-bg-en
    results:
      - task:
          name: Translation bul-eng
          type: translation
          args: bul-eng
        dataset:
          name: flores101-devtest
          type: flores_101
          args: bul eng devtest
        metrics:
          - name: BLEU
            type: bleu
            value: 42.9
      - task:
          name: Translation bul-eng
          type: translation
          args: bul-eng
        dataset:
          name: tatoeba-test-v2021-08-07
          type: tatoeba_mt
          args: bul-eng
        metrics:
          - name: BLEU
            type: bleu
            value: 60.5

opus-mt-tc-big-bg-en

Neural machine translation model for translating from Bulgarian (bg) to English (en).

This model is part of the OPUS-MT project, an effort to make neural machine translation models widely available and accessible for many languages in the world. All models were originally trained with the Marian NMT framework, an efficient NMT implementation written in pure C++, and have been converted to PyTorch using the transformers library by Hugging Face. Training data comes from OPUS, and the training pipelines follow the procedures of OPUS-MT-train.

If you use this model, please cite the following publications:

@inproceedings{tiedemann-thottingal-2020-opus,
    title = "{OPUS}-{MT} {--} Building open translation services for the World",
    author = {Tiedemann, J{\"o}rg  and Thottingal, Santhosh},
    booktitle = "Proceedings of the 22nd Annual Conference of the European Association for Machine Translation",
    month = nov,
    year = "2020",
    address = "Lisboa, Portugal",
    publisher = "European Association for Machine Translation",
    url = "https://aclanthology.org/2020.eamt-1.61",
    pages = "479--480",
}

@inproceedings{tiedemann-2020-tatoeba,
    title = "The Tatoeba Translation Challenge {--} Realistic Data Sets for Low Resource and Multilingual {MT}",
    author = {Tiedemann, J{\"o}rg},
    booktitle = "Proceedings of the Fifth Conference on Machine Translation",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2020.wmt-1.139",
    pages = "1174--1182",
}

Model info

Usage

A short code example:

from transformers import MarianMTModel, MarianTokenizer

# Bulgarian source sentences to translate
src_text = [
    "2001 е годината, с която започва 21-ви век.",
    "Това е Copacabana!"
]

model_name = "Helsinki-NLP/opus-mt-tc-big-bg-en"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Tokenize the batch and generate the translations
translated = model.generate(**tokenizer(src_text, return_tensors="pt", padding=True))

for t in translated:
    print(tokenizer.decode(t, skip_special_tokens=True))

# expected output:
#     2001 was the year the 21st century began.
#     It's Copacabana!
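
Equivalently, all generated sequences can be decoded in a single call with batch_decode; a small variant of the loop above, reusing the same tokenizer and translated tensor:

# Alternative to the loop above: decode the whole batch at once.
for s in tokenizer.batch_decode(translated, skip_special_tokens=True):
    print(s)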

You can also use OPUS-MT models with the transformers pipelines, for example:

from transformers import pipeline
pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-bg-en")
print(pipe("2001 е годината, с която започва 21-ви век."))

# expected output: 2001 was the year the 21st century began.
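
The pipeline also accepts a list of sentences; a minimal sketch, reusing the pipe object from above:

# The translation pipeline can take a batch of inputs; each result is a
# dict with a "translation_text" field.
results = pipe([
    "2001 е годината, с която започва 21-ви век.",
    "Това е Copacabana!",
])
for r in results:
    print(r["translation_text"])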

Benchmarks

langpair | testset                  | chr-F   | BLEU | #sent | #words
bul-eng  | tatoeba-test-v2021-08-07 | 0.73687 | 60.5 | 10000 | 71872
bul-eng  | flores101-devtest        | 0.67938 | 42.9 | 1012  | 24721
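
As a rough illustration (not part of the original card), scores of this kind can be computed with the sacrebleu library, assuming you already have the model's outputs and the reference translations as lists of strings:

import sacrebleu

# Hypothetical example data: model outputs and gold English references.
hypotheses = ["2001 was the year the 21st century began."]
references = ["2001 was the year that the 21st century began."]

# corpus_bleu/corpus_chrf take a list of hypothesis strings and a list of
# reference streams (one stream per reference set).
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
chrf = sacrebleu.corpus_chrf(hypotheses, [references])

# Note: sacrebleu reports chrF on a 0-100 scale; the table above uses 0-1.
print(f"BLEU: {bleu.score:.1f}  chr-F: {chrf.score / 100:.5f}")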

Acknowledgements

The work is supported by the European Language Grid as pilot project 2866, by the FoTran project, funded by the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 771113), and the MeMAD project, funded by the European Union’s Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by CSC -- IT Center for Science, Finland.

Model conversion info

  • transformers version: 4.16.2
  • OPUS-MT git hash: 3405783
  • port time: Wed Apr 13 18:23:56 EEST 2022
  • port machine: LM0-400-22516.local