Gabriele Sarti committed on
Commit cc40a74
1 Parent(s): abdcc27

Initial commit

README.md CHANGED
@@ -1,3 +1,44 @@
  ---
- license: cc-by-nc-sa-4.0
+ language:
+ - en
+ - nl
+ tags:
+ - translation
+ - opus-mt-tc
+ license: cc-by-4.0
+ model-index:
+ - name: opus-mt-tc-base-en-nl
+   results:
+   - task:
+       name: Translation eng-nld
+       type: translation
+       args: eng-nld
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-nld
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 57.5
  ---
+
+ # Opus Tatoeba English-Dutch
+
+ *This model was obtained by running the script [convert_marian_to_pytorch.py](https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/convert_marian_to_pytorch.py). The original models were trained by [Jörg Tiedemann](https://blogs.helsinki.fi/tiedeman/) using the [MarianNMT](https://marian-nmt.github.io/) library. See all available `MarianMTModel` models on the profile of the [Helsinki NLP](https://huggingface.co/Helsinki-NLP) group.*
+
+ * dataset: opus+bt
+ * model: transformer-align
+ * source language(s): eng
+ * target language(s): nld
+ * pre-processing: normalization + SentencePiece (spm32k,spm32k)
+ * download: [opus+bt-2021-04-14.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-nld/opus+bt-2021-04-14.zip)
+ * test set translations: [opus+bt-2021-04-14.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-nld/opus+bt-2021-04-14.test.txt)
+ * test set scores: [opus+bt-2021-04-14.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-nld/opus+bt-2021-04-14.eval.txt)
+
+ ## Benchmarks
+
+ | testset | BLEU | chr-F | #sent | #words | BP |
+ |---------|-------|-------|-------|--------|----|
+ | Tatoeba-test.eng-nld | 57.5 | 0.731 | 10000 | 71436 | 0.986 |
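
For reference, a minimal usage sketch with the `transformers` classes mentioned above (`MarianTokenizer` / `MarianMTModel`); the model id below is a placeholder, not taken from this commit, and should be replaced with the full `<namespace>/<repo>` Hub id of this repository:

```python
# Minimal sketch: translating English to Dutch with this converted Marian checkpoint.
# "opus-mt-tc-base-en-nl" is a placeholder id; use the full Hub id of this repository.
from transformers import MarianMTModel, MarianTokenizer

model_id = "opus-mt-tc-base-en-nl"

tokenizer = MarianTokenizer.from_pretrained(model_id)
model = MarianMTModel.from_pretrained(model_id)

batch = tokenizer(["How are you today?"], return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```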
config.json ADDED
@@ -0,0 +1,45 @@
+ {
+   "activation_dropout": 0.0,
+   "activation_function": "swish",
+   "architectures": [
+     "MarianMTModel"
+   ],
+   "attention_dropout": 0.0,
+   "bad_words_ids": [
+     [
+       56521
+     ]
+   ],
+   "bos_token_id": 0,
+   "classifier_dropout": 0.0,
+   "d_model": 512,
+   "decoder_attention_heads": 8,
+   "decoder_ffn_dim": 2048,
+   "decoder_layerdrop": 0.0,
+   "decoder_layers": 6,
+   "decoder_start_token_id": 56521,
+   "decoder_vocab_size": 56522,
+   "dropout": 0.1,
+   "encoder_attention_heads": 8,
+   "encoder_ffn_dim": 2048,
+   "encoder_layerdrop": 0.0,
+   "encoder_layers": 6,
+   "eos_token_id": 0,
+   "forced_eos_token_id": 0,
+   "init_std": 0.02,
+   "is_encoder_decoder": true,
+   "max_length": 512,
+   "max_position_embeddings": 512,
+   "model_type": "marian",
+   "normalize_embedding": false,
+   "num_beams": 6,
+   "num_hidden_layers": 6,
+   "pad_token_id": 56521,
+   "scale_embedding": true,
+   "share_encoder_decoder_embeddings": true,
+   "static_position_embeddings": true,
+   "torch_dtype": "float16",
+   "transformers_version": "4.21.0.dev0",
+   "use_cache": true,
+   "vocab_size": 56522
+ }
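
The configuration above pins down the architecture produced by the conversion: a Marian transformer with 6 encoder and 6 decoder layers, 8 attention heads, model dimension 512, a shared 56,522-token vocabulary, and float16 weights. A small sketch for inspecting these values with `transformers` (assuming `config.json` has been downloaded locally):

```python
# Sketch: reading the architecture hyperparameters from a local copy of config.json.
from transformers import MarianConfig

config = MarianConfig.from_json_file("config.json")  # assumed local path

print(config.encoder_layers, config.decoder_layers)      # 6 6
print(config.d_model, config.encoder_attention_heads)    # 512 8
print(config.vocab_size, config.decoder_start_token_id)  # 56522 56521
print(config.num_beams, config.max_length)               # 6 512
```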
opus+bt.spm32k-spm32k.transformer-align.train1.log ADDED
The diff for this file is too large to render. See raw diff
 
opus+bt.spm32k-spm32k.transformer-align.valid1.log ADDED
@@ -0,0 +1,76 @@
+ [2021-03-28 10:42:46] [valid] Ep. 1 : Up. 10000 : perplexity : 1.88848 : new best
+ [2021-03-28 12:48:11] [valid] Ep. 1 : Up. 20000 : perplexity : 1.91167 : stalled 1 times (last best: 1.88848)
+ [2021-03-28 14:53:16] [valid] Ep. 1 : Up. 30000 : perplexity : 1.90984 : stalled 2 times (last best: 1.88848)
+ [2021-03-28 16:58:21] [valid] Ep. 1 : Up. 40000 : perplexity : 1.90168 : stalled 3 times (last best: 1.88848)
+ [2021-03-28 19:03:54] [valid] Ep. 1 : Up. 50000 : perplexity : 1.90109 : stalled 4 times (last best: 1.88848)
+ [2021-03-28 21:09:42] [valid] Ep. 1 : Up. 60000 : perplexity : 1.90035 : stalled 5 times (last best: 1.88848)
+ [2021-04-02 10:09:01] [valid] Ep. 1 : Up. 70000 : perplexity : 1.89669 : stalled 5 times (last best: 1.88848)
+ [2021-04-05 19:03:37] [valid] Ep. 1 : Up. 80000 : perplexity : 1.89474 : stalled 5 times (last best: 1.88848)
+ [2021-04-06 00:54:33] [valid] Ep. 2 : Up. 90000 : perplexity : 1.89394 : stalled 5 times (last best: 1.88848)
+ [2021-04-06 03:00:01] [valid] Ep. 2 : Up. 100000 : perplexity : 1.8925 : stalled 6 times (last best: 1.88848)
+ [2021-04-06 05:05:17] [valid] Ep. 2 : Up. 110000 : perplexity : 1.89135 : stalled 7 times (last best: 1.88848)
+ [2021-04-06 07:10:24] [valid] Ep. 2 : Up. 120000 : perplexity : 1.88845 : new best
+ [2021-04-06 09:15:05] [valid] Ep. 2 : Up. 130000 : perplexity : 1.88556 : new best
+ [2021-04-06 11:20:22] [valid] Ep. 2 : Up. 140000 : perplexity : 1.88312 : new best
+ [2021-04-06 13:25:34] [valid] Ep. 2 : Up. 150000 : perplexity : 1.88199 : new best
+ [2021-04-06 15:31:01] [valid] Ep. 2 : Up. 160000 : perplexity : 1.88169 : new best
+ [2021-04-06 17:36:07] [valid] Ep. 2 : Up. 170000 : perplexity : 1.88365 : stalled 1 times (last best: 1.88169)
+ [2021-04-06 19:42:55] [valid] Ep. 3 : Up. 180000 : perplexity : 1.88647 : stalled 2 times (last best: 1.88169)
+ [2021-04-06 21:48:36] [valid] Ep. 3 : Up. 190000 : perplexity : 1.88655 : stalled 3 times (last best: 1.88169)
+ [2021-04-06 23:53:32] [valid] Ep. 3 : Up. 200000 : perplexity : 1.88541 : stalled 4 times (last best: 1.88169)
+ [2021-04-07 01:58:58] [valid] Ep. 3 : Up. 210000 : perplexity : 1.88435 : stalled 5 times (last best: 1.88169)
+ [2021-04-07 04:03:56] [valid] Ep. 3 : Up. 220000 : perplexity : 1.8836 : stalled 6 times (last best: 1.88169)
+ [2021-04-07 06:09:25] [valid] Ep. 3 : Up. 230000 : perplexity : 1.88213 : stalled 7 times (last best: 1.88169)
+ [2021-04-07 08:14:40] [valid] Ep. 3 : Up. 240000 : perplexity : 1.88113 : new best
+ [2021-04-07 10:19:54] [valid] Ep. 3 : Up. 250000 : perplexity : 1.87992 : new best
+ [2021-04-07 12:25:19] [valid] Ep. 3 : Up. 260000 : perplexity : 1.87875 : new best
+ [2021-04-07 14:31:43] [valid] Ep. 4 : Up. 270000 : perplexity : 1.87816 : new best
+ [2021-04-07 16:36:44] [valid] Ep. 4 : Up. 280000 : perplexity : 1.87835 : stalled 1 times (last best: 1.87816)
+ [2021-04-07 18:41:26] [valid] Ep. 4 : Up. 290000 : perplexity : 1.87794 : new best
+ [2021-04-07 20:46:47] [valid] Ep. 4 : Up. 300000 : perplexity : 1.87728 : new best
+ [2021-04-07 22:52:07] [valid] Ep. 4 : Up. 310000 : perplexity : 1.8775 : stalled 1 times (last best: 1.87728)
+ [2021-04-08 00:57:43] [valid] Ep. 4 : Up. 320000 : perplexity : 1.87683 : new best
+ [2021-04-08 03:03:13] [valid] Ep. 4 : Up. 330000 : perplexity : 1.8766 : new best
+ [2021-04-08 05:08:27] [valid] Ep. 4 : Up. 340000 : perplexity : 1.87584 : new best
+ [2021-04-08 07:13:40] [valid] Ep. 4 : Up. 350000 : perplexity : 1.87712 : stalled 1 times (last best: 1.87584)
+ [2021-04-08 09:20:38] [valid] Ep. 5 : Up. 360000 : perplexity : 1.87757 : stalled 2 times (last best: 1.87584)
+ [2021-04-08 11:25:42] [valid] Ep. 5 : Up. 370000 : perplexity : 1.87697 : stalled 3 times (last best: 1.87584)
+ [2021-04-08 13:31:25] [valid] Ep. 5 : Up. 380000 : perplexity : 1.87656 : stalled 4 times (last best: 1.87584)
+ [2021-04-08 15:36:41] [valid] Ep. 5 : Up. 390000 : perplexity : 1.87554 : new best
+ [2021-04-08 17:42:17] [valid] Ep. 5 : Up. 400000 : perplexity : 1.87514 : new best
+ [2021-04-08 19:47:39] [valid] Ep. 5 : Up. 410000 : perplexity : 1.87363 : new best
+ [2021-04-08 21:52:52] [valid] Ep. 5 : Up. 420000 : perplexity : 1.87405 : stalled 1 times (last best: 1.87363)
+ [2021-04-10 21:44:08] [valid] Ep. 5 : Up. 430000 : perplexity : 1.8733 : new best
+ [2021-04-10 23:48:53] [valid] Ep. 5 : Up. 440000 : perplexity : 1.87317 : new best
+ [2021-04-11 01:55:54] [valid] Ep. 6 : Up. 450000 : perplexity : 1.87329 : stalled 1 times (last best: 1.87317)
+ [2021-04-11 04:01:02] [valid] Ep. 6 : Up. 460000 : perplexity : 1.87395 : stalled 2 times (last best: 1.87317)
+ [2021-04-11 06:06:09] [valid] Ep. 6 : Up. 470000 : perplexity : 1.87372 : stalled 3 times (last best: 1.87317)
+ [2021-04-11 08:11:32] [valid] Ep. 6 : Up. 480000 : perplexity : 1.87407 : stalled 4 times (last best: 1.87317)
+ [2021-04-11 10:16:41] [valid] Ep. 6 : Up. 490000 : perplexity : 1.87471 : stalled 5 times (last best: 1.87317)
+ [2021-04-11 12:21:56] [valid] Ep. 6 : Up. 500000 : perplexity : 1.87397 : stalled 6 times (last best: 1.87317)
+ [2021-04-11 14:27:27] [valid] Ep. 6 : Up. 510000 : perplexity : 1.87282 : new best
+ [2021-04-11 16:32:34] [valid] Ep. 6 : Up. 520000 : perplexity : 1.8736 : stalled 1 times (last best: 1.87282)
+ [2021-04-11 18:37:34] [valid] Ep. 6 : Up. 530000 : perplexity : 1.87476 : stalled 2 times (last best: 1.87282)
+ [2021-04-11 20:44:23] [valid] Ep. 7 : Up. 540000 : perplexity : 1.87514 : stalled 3 times (last best: 1.87282)
+ [2021-04-11 22:49:35] [valid] Ep. 7 : Up. 550000 : perplexity : 1.87457 : stalled 4 times (last best: 1.87282)
+ [2021-04-12 00:54:35] [valid] Ep. 7 : Up. 560000 : perplexity : 1.87461 : stalled 5 times (last best: 1.87282)
+ [2021-04-12 03:00:08] [valid] Ep. 7 : Up. 570000 : perplexity : 1.87325 : stalled 6 times (last best: 1.87282)
+ [2021-04-12 05:05:07] [valid] Ep. 7 : Up. 580000 : perplexity : 1.87299 : stalled 7 times (last best: 1.87282)
+ [2021-04-12 07:10:26] [valid] Ep. 7 : Up. 590000 : perplexity : 1.87319 : stalled 8 times (last best: 1.87282)
+ [2021-04-12 09:15:27] [valid] Ep. 7 : Up. 600000 : perplexity : 1.87386 : stalled 9 times (last best: 1.87282)
+ [2021-04-12 11:20:27] [valid] Ep. 7 : Up. 610000 : perplexity : 1.873 : stalled 10 times (last best: 1.87282)
+ [2021-04-12 13:27:10] [valid] Ep. 8 : Up. 620000 : perplexity : 1.87256 : new best
+ [2021-04-12 15:32:05] [valid] Ep. 8 : Up. 630000 : perplexity : 1.87236 : new best
+ [2021-04-12 17:37:08] [valid] Ep. 8 : Up. 640000 : perplexity : 1.87305 : stalled 1 times (last best: 1.87236)
+ [2021-04-12 19:42:52] [valid] Ep. 8 : Up. 650000 : perplexity : 1.87328 : stalled 2 times (last best: 1.87236)
+ [2021-04-12 21:48:11] [valid] Ep. 8 : Up. 660000 : perplexity : 1.8724 : stalled 3 times (last best: 1.87236)
+ [2021-04-12 23:53:17] [valid] Ep. 8 : Up. 670000 : perplexity : 1.87285 : stalled 4 times (last best: 1.87236)
+ [2021-04-13 01:58:19] [valid] Ep. 8 : Up. 680000 : perplexity : 1.87281 : stalled 5 times (last best: 1.87236)
+ [2021-04-13 04:03:27] [valid] Ep. 8 : Up. 690000 : perplexity : 1.87234 : new best
+ [2021-04-13 06:09:09] [valid] Ep. 8 : Up. 700000 : perplexity : 1.87199 : new best
+ [2021-04-13 08:15:24] [valid] Ep. 9 : Up. 710000 : perplexity : 1.87139 : new best
+ [2021-04-13 10:20:46] [valid] Ep. 9 : Up. 720000 : perplexity : 1.87127 : new best
+ [2021-04-13 12:25:56] [valid] Ep. 9 : Up. 730000 : perplexity : 1.87003 : new best
+ [2021-04-13 14:31:40] [valid] Ep. 9 : Up. 740000 : perplexity : 1.87066 : stalled 1 times (last best: 1.87003)
+ [2021-04-13 16:36:33] [valid] Ep. 9 : Up. 750000 : perplexity : 1.87086 : stalled 2 times (last best: 1.87003)
+ [2021-04-13 18:41:41] [valid] Ep. 9 : Up. 760000 : perplexity : 1.87066 : stalled 3 times (last best: 1.87003)
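
The validation log above records held-out perplexity every 10,000 updates together with Marian's early-stopping stall counter; the best value in this trace is 1.87003 at update 730000. A small sketch for pulling the (update, perplexity) curve out of a log in this format:

```python
# Sketch: parse Marian validation-log lines of the form
#   [timestamp] [valid] Ep. N : Up. U : perplexity : P : ...
# and report the best perplexity. The filename is taken from this repository.
import re

pattern = re.compile(r"Up\. (\d+) : perplexity : ([\d.]+)")
points = []
with open("opus+bt.spm32k-spm32k.transformer-align.valid1.log") as log:
    for line in log:
        match = pattern.search(line)
        if match:
            points.append((int(match.group(1)), float(match.group(2))))

best_update, best_ppl = min(points, key=lambda p: p[1])
print(f"best validation perplexity {best_ppl} at update {best_update}")
```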
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f5993c67efd07da6dba22f63c42bc3ca58117661e5446f9bde8cd5cc6aa2f3a5
+ size 204228163
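
The weights themselves are stored with Git LFS; the pointer above records only their SHA-256 digest and size (204228163 bytes, roughly 204 MB). A quick sketch for checking a downloaded `pytorch_model.bin` against that pointer (local path assumed):

```python
# Sketch: verify a local pytorch_model.bin against the Git LFS pointer above.
import hashlib
import os

EXPECTED_SHA256 = "f5993c67efd07da6dba22f63c42bc3ca58117661e5446f9bde8cd5cc6aa2f3a5"
EXPECTED_SIZE = 204228163
path = "pytorch_model.bin"  # assumed download location

digest = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        digest.update(chunk)

assert os.path.getsize(path) == EXPECTED_SIZE, "size mismatch"
assert digest.hexdigest() == EXPECTED_SHA256, "checksum mismatch"
print("pytorch_model.bin matches the LFS pointer")
```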
source.spm ADDED
Binary file (782 kB).
 
special_tokens_map.json ADDED
@@ -0,0 +1,5 @@
+ {
+   "eos_token": "</s>",
+   "pad_token": "<pad>",
+   "unk_token": "<unk>"
+ }
target.spm ADDED
Binary file (800 kB).
 
tokenizer_config.json ADDED
@@ -0,0 +1,14 @@
+ {
+   "eos_token": "</s>",
+   "model_max_length": 512,
+   "name_or_path": "models/opus-mt-tc-base-en-nl",
+   "pad_token": "<pad>",
+   "separate_vocabs": false,
+   "source_lang": "opus-mt-tc-base-en",
+   "sp_model_kwargs": {},
+   "special_tokens_map_file": null,
+   "target_lang": "nl",
+   "tokenizer_class": "MarianTokenizer",
+   "tokenizer_file": null,
+   "unk_token": "<unk>"
+ }
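
Together with `source.spm`, `target.spm` and `vocab.json`, these two files are all that `MarianTokenizer` needs to rebuild the SentencePiece pipeline, with `</s>` as end-of-sequence token, `<pad>` for padding, `<unk>` for unknowns, and a 512-token maximum length. A minimal loading sketch (the local directory path is an assumption):

```python
# Sketch: load the tokenizer from a local directory containing the files above.
from transformers import MarianTokenizer

tokenizer = MarianTokenizer.from_pretrained(".")  # assumed local path

print(tokenizer.eos_token, tokenizer.pad_token, tokenizer.unk_token)  # </s> <pad> <unk>
print(tokenizer.model_max_length)                                     # 512

ids = tokenizer("How are you today?").input_ids  # ends with the </s> id
print(tokenizer.convert_ids_to_tokens(ids))
```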
vocab.json ADDED
The diff for this file is too large to render. See raw diff