---
tags:
- translation
license: cc0-1.0
---
### en-mt HPLT v1.0
Note: This repository only contains the model weights. For usage instructions, evaluation scripts, and inference scripts, please refer to the [HPLT-MT-Models v1.0](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0) GitHub repository.
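Since only the weights live here, they can be pulled locally with `huggingface_hub` before following the HPLT-MT-Models instructions. The snippet below is a minimal sketch; the `repo_id` is a placeholder, not the confirmed identifier of this repository.

```python
# Minimal sketch: download the repository contents locally with huggingface_hub.
# The repo_id below is a placeholder; substitute the actual id of this repository.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="HPLT/translate-en-mt-v1.0")  # placeholder id
print(local_dir)  # local path containing the model weights and vocabulary files
```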
* source language: en
* target language: mt
* dataset: OPUS + HPLTDatasets v1.2
* model: transformer-base
* tokenizer: SentencePiece (Unigram), see the tokenization sketch after this list
* cleaning: We use OpusCleaner to clean the corpus. Details about the rules used can be found in the filter files on [GitHub](https://github.com/hplt-project/HPLT-MT-Models/tree/main/v1.0/data/en-mt/raw/v2).
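As a minimal sketch of the SentencePiece (Unigram) tokenizer listed above, the snippet below applies a vocabulary with the `sentencepiece` package; the file name `spm.model` is an assumption, since the actual vocabulary file name is not stated here.

```python
# Sketch of applying the SentencePiece (Unigram) vocabulary with the
# sentencepiece package. The file name "spm.model" is an assumption;
# check the repository contents for the actual name.
import sentencepiece as spm

sp = spm.SentencePieceProcessor(model_file="spm.model")

text = "This is a short English sentence."
pieces = sp.encode(text, out_type=str)  # subword pieces fed to the model
ids = sp.encode(text, out_type=int)     # corresponding vocabulary ids

print(pieces)
print(sp.decode(ids))                   # round-trips back to the original text
```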
## Benchmarks
| test set        | BLEU | chrF | COMET |
| --------------- | ---- | ---- | ----- |
| flores200.en.mt | 47.5 | 0.64 | 0.64  |
| ntrex.en.mt     | 25.0 | 0.62 | 0.62  |
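The BLEU and chrF columns can be recomputed with `sacrebleu` once system outputs and references are available. The sketch below is an illustration under assumed inputs, not the exact HPLT evaluation setup; the official evaluation scripts live in the HPLT-MT-Models repository, and COMET scores additionally require the separate `unbabel-comet` package. Note that the table reports chrF on a 0-1 scale, while sacrebleu returns values on a 0-100 scale.

```python
# Sketch of scoring system outputs against references with sacrebleu.
# The sentences below are placeholders; the official evaluation scripts are in
# the HPLT-MT-Models repository.
import sacrebleu

hyps = ["system translation of sentence 1", "system translation of sentence 2"]
refs = [["reference translation 1", "reference translation 2"]]  # one list per reference set

bleu = sacrebleu.corpus_bleu(hyps, refs)
chrf = sacrebleu.corpus_chrf(hyps, refs)

print(f"BLEU: {bleu.score:.1f}")
print(f"chrF: {chrf.score / 100:.2f}")  # divide by 100 to match the 0-1 scale above
```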