---
language:
- en
- gl
tags:
- translation
license: cc-by-4.0
---
### Translation model for en-gl OPUS_HPLT v1.0
This repository contains the model weights for translation models trained with Marian for the HPLT project. For usage instructions, evaluation scripts, and inference scripts, please refer to the [HPLT-MT-Models v1.0](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0) GitHub repository.
* Source language: en
* Target language: gl
* Dataset: All of OPUS including HPLT
* Model: transformer-base
* Tokenizer: SentencePiece (Unigram)
* Cleaning: We used OpusCleaner to clean the corpus. Details about the rules used can be found in the filter files on [GitHub](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0/data/gl-en/raw/v2).
To run inference with Marian, refer to the [Inference/Decoding/Translation](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0#inferencedecodingtranslation) section of our GitHub repository.
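As a minimal sketch of how one might wrap the Marian decoder from Python, assuming the `marian-decoder` binary is installed and on `PATH`. The checkpoint and vocabulary file names below are placeholders, not the actual distributed file names; see the GitHub repository for the authoritative instructions.

```python
import subprocess

# Placeholder file names: substitute the checkpoint and SentencePiece
# vocabulary shipped with this model.
MODEL = "model.npz"
VOCAB = "vocab.en-gl.spm"

def translate(sentences):
    """Translate English sentences into Galician by piping them
    through marian-decoder (one sentence per input line)."""
    proc = subprocess.run(
        [
            "marian-decoder",
            "-m", MODEL,
            "-v", VOCAB, VOCAB,   # shared source/target SentencePiece vocab
            "--cpu-threads", "4",
        ],
        input="\n".join(sentences),
        capture_output=True,
        text=True,
        check=True,
    )
    return proc.stdout.strip().split("\n")

if __name__ == "__main__":
    print(translate(["Hello, how are you?"]))
```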
## Benchmarks
| testset   | BLEU | chr-F | COMET-22 |
| --------- | ---- | ----- | -------- |
| flores200 | 32.5 | 57.5  | 0.842    |
| ntrex     | 32.7 | 56.8  | 0.7962   |
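The scores above can be reproduced with the evaluation scripts in the GitHub repository. As an illustrative sketch only, BLEU and chr-F can be computed with `sacrebleu` over detokenized system output and references (the file names below are hypothetical, one sentence per line); COMET-22 would additionally require the `unbabel-comet` package.

```python
import sacrebleu

# Hypothetical file names: system output and reference, one sentence per line.
with open("flores200.en-gl.hyp", encoding="utf-8") as f:
    hyps = [line.rstrip("\n") for line in f]
with open("flores200.en-gl.ref", encoding="utf-8") as f:
    refs = [line.rstrip("\n") for line in f]

bleu = sacrebleu.corpus_bleu(hyps, [refs])   # corpus-level BLEU
chrf = sacrebleu.corpus_chrf(hyps, [refs])   # corpus-level chr-F
print(f"BLEU: {bleu.score:.1f}  chr-F: {chrf.score:.1f}")
```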