opus-mt-tc-base-en-ru / opus+bt.spm32k-spm32k.transformer-align.train1.log
[2021-03-28 22:24:02] [marian] Marian v1.10.0 6f6d484 2021-02-06 15:35:16 -0800
[2021-03-28 22:24:02] [marian] Running on r18g02.bullx as process 135320 with command line:
[2021-03-28 22:24:02] [marian] /projappl/project_2001194/marian/build/marian --guided-alignment /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.spm32k-spm32k.src-trg.alg.gz --early-stopping 15 --valid-freq 10000 --valid-sets /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/val/Tatoeba-dev.src.spm32k /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/val/Tatoeba-dev.trg.spm32k --valid-metrics perplexity --valid-mini-batch 16 --valid-log /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.valid1.log --beam-size 12 --normalize 1 --allow-unk --overwrite --keep-best --model /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz --type transformer --train-sets /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.src.clean.spm32k.gz /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.trg.clean.spm32k.gz --max-length 500 --vocabs /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml --mini-batch-fit -w 24000 --maxi-batch 500 --save-freq 10000 --disp-freq 10000 --log /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.train1.log --enc-depth 6 --dec-depth 6 --transformer-heads 8 --transformer-postprocess-emb d --transformer-postprocess dan --transformer-dropout 0.1 --label-smoothing 0.1 --learn-rate 0.0003 --lr-warmup 16000 --lr-decay-inv-sqrt 16000 --lr-report --optimizer-params 0.9 0.98 1e-09 --clip-norm 5 --fp16 --tied-embeddings-all --devices 0 1 2 3 --sync-sgd --seed 1111 --sqlite --tempdir /run/nvme/job_5338591/data --exponential-smoothing
[2021-03-28 22:24:02] [config] after: 0e
[2021-03-28 22:24:02] [config] after-batches: 0
[2021-03-28 22:24:02] [config] after-epochs: 0
[2021-03-28 22:24:02] [config] all-caps-every: 0
[2021-03-28 22:24:02] [config] allow-unk: true
[2021-03-28 22:24:02] [config] authors: false
[2021-03-28 22:24:02] [config] beam-size: 12
[2021-03-28 22:24:02] [config] bert-class-symbol: "[CLS]"
[2021-03-28 22:24:02] [config] bert-mask-symbol: "[MASK]"
[2021-03-28 22:24:02] [config] bert-masking-fraction: 0.15
[2021-03-28 22:24:02] [config] bert-sep-symbol: "[SEP]"
[2021-03-28 22:24:02] [config] bert-train-type-embeddings: true
[2021-03-28 22:24:02] [config] bert-type-vocab-size: 2
[2021-03-28 22:24:02] [config] build-info: ""
[2021-03-28 22:24:02] [config] cite: false
[2021-03-28 22:24:02] [config] clip-norm: 5
[2021-03-28 22:24:02] [config] cost-scaling:
[2021-03-28 22:24:02] [config] - 7
[2021-03-28 22:24:02] [config] - 2000
[2021-03-28 22:24:02] [config] - 2
[2021-03-28 22:24:02] [config] - 0.05
[2021-03-28 22:24:02] [config] - 10
[2021-03-28 22:24:02] [config] - 1
[2021-03-28 22:24:02] [config] cost-type: ce-sum
[2021-03-28 22:24:02] [config] cpu-threads: 0
[2021-03-28 22:24:02] [config] data-weighting: ""
[2021-03-28 22:24:02] [config] data-weighting-type: sentence
[2021-03-28 22:24:02] [config] dec-cell: gru
[2021-03-28 22:24:02] [config] dec-cell-base-depth: 2
[2021-03-28 22:24:02] [config] dec-cell-high-depth: 1
[2021-03-28 22:24:02] [config] dec-depth: 6
[2021-03-28 22:24:02] [config] devices:
[2021-03-28 22:24:02] [config] - 0
[2021-03-28 22:24:02] [config] - 1
[2021-03-28 22:24:02] [config] - 2
[2021-03-28 22:24:02] [config] - 3
[2021-03-28 22:24:02] [config] dim-emb: 512
[2021-03-28 22:24:02] [config] dim-rnn: 1024
[2021-03-28 22:24:02] [config] dim-vocabs:
[2021-03-28 22:24:02] [config] - 0
[2021-03-28 22:24:02] [config] - 0
[2021-03-28 22:24:02] [config] disp-first: 0
[2021-03-28 22:24:02] [config] disp-freq: 10000
[2021-03-28 22:24:02] [config] disp-label-counts: true
[2021-03-28 22:24:02] [config] dropout-rnn: 0
[2021-03-28 22:24:02] [config] dropout-src: 0
[2021-03-28 22:24:02] [config] dropout-trg: 0
[2021-03-28 22:24:02] [config] dump-config: ""
[2021-03-28 22:24:02] [config] early-stopping: 15
[2021-03-28 22:24:02] [config] embedding-fix-src: false
[2021-03-28 22:24:02] [config] embedding-fix-trg: false
[2021-03-28 22:24:02] [config] embedding-normalization: false
[2021-03-28 22:24:02] [config] embedding-vectors:
[2021-03-28 22:24:02] [config] []
[2021-03-28 22:24:02] [config] enc-cell: gru
[2021-03-28 22:24:02] [config] enc-cell-depth: 1
[2021-03-28 22:24:02] [config] enc-depth: 6
[2021-03-28 22:24:02] [config] enc-type: bidirectional
[2021-03-28 22:24:02] [config] english-title-case-every: 0
[2021-03-28 22:24:02] [config] exponential-smoothing: 0.0001
[2021-03-28 22:24:02] [config] factor-weight: 1
[2021-03-28 22:24:02] [config] grad-dropping-momentum: 0
[2021-03-28 22:24:02] [config] grad-dropping-rate: 0
[2021-03-28 22:24:02] [config] grad-dropping-warmup: 100
[2021-03-28 22:24:02] [config] gradient-checkpointing: false
[2021-03-28 22:24:02] [config] guided-alignment: /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.spm32k-spm32k.src-trg.alg.gz
[2021-03-28 22:24:02] [config] guided-alignment-cost: mse
[2021-03-28 22:24:02] [config] guided-alignment-weight: 0.1
[2021-03-28 22:24:02] [config] ignore-model-config: false
[2021-03-28 22:24:02] [config] input-types:
[2021-03-28 22:24:02] [config] []
[2021-03-28 22:24:02] [config] interpolate-env-vars: false
[2021-03-28 22:24:02] [config] keep-best: true
[2021-03-28 22:24:02] [config] label-smoothing: 0.1
[2021-03-28 22:24:02] [config] layer-normalization: false
[2021-03-28 22:24:02] [config] learn-rate: 0.0003
[2021-03-28 22:24:02] [config] lemma-dim-emb: 0
[2021-03-28 22:24:02] [config] log: /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.train1.log
[2021-03-28 22:24:02] [config] log-level: info
[2021-03-28 22:24:02] [config] log-time-zone: ""
[2021-03-28 22:24:02] [config] logical-epoch:
[2021-03-28 22:24:02] [config] - 1e
[2021-03-28 22:24:02] [config] - 0
[2021-03-28 22:24:02] [config] lr-decay: 0
[2021-03-28 22:24:02] [config] lr-decay-freq: 50000
[2021-03-28 22:24:02] [config] lr-decay-inv-sqrt:
[2021-03-28 22:24:02] [config] - 16000
[2021-03-28 22:24:02] [config] lr-decay-repeat-warmup: false
[2021-03-28 22:24:02] [config] lr-decay-reset-optimizer: false
[2021-03-28 22:24:02] [config] lr-decay-start:
[2021-03-28 22:24:02] [config] - 10
[2021-03-28 22:24:02] [config] - 1
[2021-03-28 22:24:02] [config] lr-decay-strategy: epoch+stalled
[2021-03-28 22:24:02] [config] lr-report: true
[2021-03-28 22:24:02] [config] lr-warmup: 16000
[2021-03-28 22:24:02] [config] lr-warmup-at-reload: false
[2021-03-28 22:24:02] [config] lr-warmup-cycle: false
[2021-03-28 22:24:02] [config] lr-warmup-start-rate: 0
[2021-03-28 22:24:02] [config] max-length: 500
[2021-03-28 22:24:02] [config] max-length-crop: false
[2021-03-28 22:24:02] [config] max-length-factor: 3
[2021-03-28 22:24:02] [config] maxi-batch: 500
[2021-03-28 22:24:02] [config] maxi-batch-sort: trg
[2021-03-28 22:24:02] [config] mini-batch: 64
[2021-03-28 22:24:02] [config] mini-batch-fit: true
[2021-03-28 22:24:02] [config] mini-batch-fit-step: 10
[2021-03-28 22:24:02] [config] mini-batch-track-lr: false
[2021-03-28 22:24:02] [config] mini-batch-warmup: 0
[2021-03-28 22:24:02] [config] mini-batch-words: 0
[2021-03-28 22:24:02] [config] mini-batch-words-ref: 0
[2021-03-28 22:24:02] [config] model: /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-03-28 22:24:02] [config] multi-loss-type: sum
[2021-03-28 22:24:02] [config] multi-node: false
[2021-03-28 22:24:02] [config] multi-node-overlap: true
[2021-03-28 22:24:02] [config] n-best: false
[2021-03-28 22:24:02] [config] no-nccl: false
[2021-03-28 22:24:02] [config] no-reload: false
[2021-03-28 22:24:02] [config] no-restore-corpus: false
[2021-03-28 22:24:02] [config] normalize: 1
[2021-03-28 22:24:02] [config] normalize-gradient: false
[2021-03-28 22:24:02] [config] num-devices: 0
[2021-03-28 22:24:02] [config] optimizer: adam
[2021-03-28 22:24:02] [config] optimizer-delay: 1
[2021-03-28 22:24:02] [config] optimizer-params:
[2021-03-28 22:24:02] [config] - 0.9
[2021-03-28 22:24:02] [config] - 0.98
[2021-03-28 22:24:02] [config] - 1e-09
[2021-03-28 22:24:02] [config] output-omit-bias: false
[2021-03-28 22:24:02] [config] overwrite: true
[2021-03-28 22:24:02] [config] precision:
[2021-03-28 22:24:02] [config] - float16
[2021-03-28 22:24:02] [config] - float32
[2021-03-28 22:24:02] [config] - float32
[2021-03-28 22:24:02] [config] pretrained-model: ""
[2021-03-28 22:24:02] [config] quantize-biases: false
[2021-03-28 22:24:02] [config] quantize-bits: 0
[2021-03-28 22:24:02] [config] quantize-log-based: false
[2021-03-28 22:24:02] [config] quantize-optimization-steps: 0
[2021-03-28 22:24:02] [config] quiet: false
[2021-03-28 22:24:02] [config] quiet-translation: false
[2021-03-28 22:24:02] [config] relative-paths: false
[2021-03-28 22:24:02] [config] right-left: false
[2021-03-28 22:24:02] [config] save-freq: 10000
[2021-03-28 22:24:02] [config] seed: 1111
[2021-03-28 22:24:02] [config] sentencepiece-alphas:
[2021-03-28 22:24:02] [config] []
[2021-03-28 22:24:02] [config] sentencepiece-max-lines: 2000000
[2021-03-28 22:24:02] [config] sentencepiece-options: ""
[2021-03-28 22:24:02] [config] shuffle: data
[2021-03-28 22:24:02] [config] shuffle-in-ram: false
[2021-03-28 22:24:02] [config] sigterm: save-and-exit
[2021-03-28 22:24:02] [config] skip: false
[2021-03-28 22:24:02] [config] sqlite: temporary
[2021-03-28 22:24:02] [config] sqlite-drop: false
[2021-03-28 22:24:02] [config] sync-sgd: true
[2021-03-28 22:24:02] [config] tempdir: /run/nvme/job_5338591/data
[2021-03-28 22:24:02] [config] tied-embeddings: false
[2021-03-28 22:24:02] [config] tied-embeddings-all: true
[2021-03-28 22:24:02] [config] tied-embeddings-src: false
[2021-03-28 22:24:02] [config] train-embedder-rank:
[2021-03-28 22:24:02] [config] []
[2021-03-28 22:24:02] [config] train-sets:
[2021-03-28 22:24:02] [config] - /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.src.clean.spm32k.gz
[2021-03-28 22:24:02] [config] - /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.trg.clean.spm32k.gz
[2021-03-28 22:24:02] [config] transformer-aan-activation: swish
[2021-03-28 22:24:02] [config] transformer-aan-depth: 2
[2021-03-28 22:24:02] [config] transformer-aan-nogate: false
[2021-03-28 22:24:02] [config] transformer-decoder-autoreg: self-attention
[2021-03-28 22:24:02] [config] transformer-depth-scaling: false
[2021-03-28 22:24:02] [config] transformer-dim-aan: 2048
[2021-03-28 22:24:02] [config] transformer-dim-ffn: 2048
[2021-03-28 22:24:02] [config] transformer-dropout: 0.1
[2021-03-28 22:24:02] [config] transformer-dropout-attention: 0
[2021-03-28 22:24:02] [config] transformer-dropout-ffn: 0
[2021-03-28 22:24:02] [config] transformer-ffn-activation: swish
[2021-03-28 22:24:02] [config] transformer-ffn-depth: 2
[2021-03-28 22:24:02] [config] transformer-guided-alignment-layer: last
[2021-03-28 22:24:02] [config] transformer-heads: 8
[2021-03-28 22:24:02] [config] transformer-no-projection: false
[2021-03-28 22:24:02] [config] transformer-pool: false
[2021-03-28 22:24:02] [config] transformer-postprocess: dan
[2021-03-28 22:24:02] [config] transformer-postprocess-emb: d
[2021-03-28 22:24:02] [config] transformer-postprocess-top: ""
[2021-03-28 22:24:02] [config] transformer-preprocess: ""
[2021-03-28 22:24:02] [config] transformer-tied-layers:
[2021-03-28 22:24:02] [config] []
[2021-03-28 22:24:02] [config] transformer-train-position-embeddings: false
[2021-03-28 22:24:02] [config] tsv: false
[2021-03-28 22:24:02] [config] tsv-fields: 0
[2021-03-28 22:24:02] [config] type: transformer
[2021-03-28 22:24:02] [config] ulr: false
[2021-03-28 22:24:02] [config] ulr-dim-emb: 0
[2021-03-28 22:24:02] [config] ulr-dropout: 0
[2021-03-28 22:24:02] [config] ulr-keys-vectors: ""
[2021-03-28 22:24:02] [config] ulr-query-vectors: ""
[2021-03-28 22:24:02] [config] ulr-softmax-temperature: 1
[2021-03-28 22:24:02] [config] ulr-trainable-transformation: false
[2021-03-28 22:24:02] [config] unlikelihood-loss: false
[2021-03-28 22:24:02] [config] valid-freq: 10000
[2021-03-28 22:24:02] [config] valid-log: /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.valid1.log
[2021-03-28 22:24:02] [config] valid-max-length: 1000
[2021-03-28 22:24:02] [config] valid-metrics:
[2021-03-28 22:24:02] [config] - perplexity
[2021-03-28 22:24:02] [config] valid-mini-batch: 16
[2021-03-28 22:24:02] [config] valid-reset-stalled: false
[2021-03-28 22:24:02] [config] valid-script-args:
[2021-03-28 22:24:02] [config] []
[2021-03-28 22:24:02] [config] valid-script-path: ""
[2021-03-28 22:24:02] [config] valid-sets:
[2021-03-28 22:24:02] [config] - /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/val/Tatoeba-dev.src.spm32k
[2021-03-28 22:24:02] [config] - /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/val/Tatoeba-dev.trg.spm32k
[2021-03-28 22:24:02] [config] valid-translation-output: ""
[2021-03-28 22:24:02] [config] vocabs:
[2021-03-28 22:24:02] [config] - /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml
[2021-03-28 22:24:02] [config] - /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml
[2021-03-28 22:24:02] [config] word-penalty: 0
[2021-03-28 22:24:02] [config] word-scores: false
[2021-03-28 22:24:02] [config] workspace: 24000
[2021-03-28 22:24:02] [config] Model is being created with Marian v1.10.0 6f6d484 2021-02-06 15:35:16 -0800
[2021-03-28 22:24:02] Using synchronous SGD
[2021-03-28 22:24:02] [data] Loading vocabulary from JSON/Yaml file /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml
[2021-03-28 22:24:03] [data] Setting vocabulary size for input 0 to 60,878
[2021-03-28 22:24:03] [data] Loading vocabulary from JSON/Yaml file /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml
[2021-03-28 22:24:03] [data] Setting vocabulary size for input 1 to 60,878
[2021-03-28 22:24:03] [data] Using word alignments from file /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.spm32k-spm32k.src-trg.alg.gz
[2021-03-28 22:24:03] [sqlite] Creating temporary database in /run/nvme/job_5338591/data
[2021-03-28 22:24:07] [sqlite] Inserted 1000000 lines
[2021-03-28 22:24:11] [sqlite] Inserted 2000000 lines
[2021-03-28 22:24:19] [sqlite] Inserted 4000000 lines
[2021-03-28 22:24:35] [sqlite] Inserted 8000000 lines
[2021-03-28 22:25:07] [sqlite] Inserted 16000000 lines
[2021-03-28 22:26:09] [sqlite] Inserted 32000000 lines
[2021-03-28 22:27:40] [sqlite] Inserted 64000000 lines
[2021-03-28 22:29:22] [sqlite] Inserted 90910822 lines
[2021-03-28 22:29:22] [sqlite] Creating primary index
[2021-03-28 22:30:09] [comm] Compiled without MPI support. Running as a single process on r18g02.bullx
[2021-03-28 22:30:09] [batching] Collecting statistics for batch fitting with step size 10
[2021-03-28 22:30:18] [memory] Extending reserved space to 24064 MB (device gpu0)
[2021-03-28 22:30:19] [memory] Extending reserved space to 24064 MB (device gpu1)
[2021-03-28 22:30:19] [memory] Extending reserved space to 24064 MB (device gpu2)
[2021-03-28 22:30:20] [memory] Extending reserved space to 24064 MB (device gpu3)
[2021-03-28 22:30:20] [comm] Using NCCL 2.8.3 for GPU communication
[2021-03-28 22:30:21] [comm] NCCLCommunicator constructed successfully
[2021-03-28 22:30:21] [training] Using 4 GPUs
[2021-03-28 22:30:22] [logits] Applying loss function for 1 factor(s)
[2021-03-28 22:30:22] [memory] Reserving 287 MB, device gpu0
[2021-03-28 22:30:22] [gpu] 16-bit TensorCores enabled for float32 matrix operations
[2021-03-28 22:30:23] [memory] Reserving 287 MB, device gpu0
[2021-03-28 22:33:08] [batching] Done. Typical MB size is 57,739 target words
[2021-03-28 22:33:09] [memory] Extending reserved space to 24064 MB (device gpu0)
[2021-03-28 22:33:09] [memory] Extending reserved space to 24064 MB (device gpu1)
[2021-03-28 22:33:09] [memory] Extending reserved space to 24064 MB (device gpu2)
[2021-03-28 22:33:09] [memory] Extending reserved space to 24064 MB (device gpu3)
[2021-03-28 22:33:09] [comm] Using NCCL 2.8.3 for GPU communication
[2021-03-28 22:33:10] [comm] NCCLCommunicator constructed successfully
[2021-03-28 22:33:10] [training] Using 4 GPUs
[2021-03-28 22:33:10] Training started
[2021-03-28 22:33:10] [sqlite] Selecting shuffled data
[2021-03-28 22:35:02] [training] Batches are processed as 1 process(es) x 4 devices/process
[2021-03-28 22:35:02] [memory] Reserving 287 MB, device gpu0
[2021-03-28 22:35:02] [memory] Reserving 287 MB, device gpu1
[2021-03-28 22:35:02] [memory] Reserving 287 MB, device gpu2
[2021-03-28 22:35:02] [memory] Reserving 287 MB, device gpu3
[2021-03-28 22:35:02] [memory] Reserving 287 MB, device gpu0
[2021-03-28 22:35:02] [memory] Reserving 287 MB, device gpu2
[2021-03-28 22:35:02] [memory] Reserving 287 MB, device gpu1
[2021-03-28 22:35:02] [memory] Reserving 287 MB, device gpu3
[2021-03-28 22:35:02] [memory] Reserving 71 MB, device gpu0
[2021-03-28 22:35:02] [memory] Reserving 71 MB, device gpu1
[2021-03-28 22:35:02] [memory] Reserving 71 MB, device gpu2
[2021-03-28 22:35:02] [memory] Reserving 71 MB, device gpu3
[2021-03-28 22:35:02] [memory] Reserving 143 MB, device gpu0
[2021-03-28 22:35:02] [memory] Reserving 143 MB, device gpu3
[2021-03-28 22:35:02] [memory] Reserving 143 MB, device gpu2
[2021-03-28 22:35:02] [memory] Reserving 143 MB, device gpu1
[2021-03-29 00:40:05] Ep. 1 : Up. 10000 : Sen. 7,937,432 : Cost 0.76499343 * 1,539,648,423 @ 19,932 after 1,539,648,423 : Time 7616.30s : 25243.16 words/s : L.r. 1.8750e-04
[2021-03-29 00:40:05] Saving model weights and runtime parameters to /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-03-29 00:40:07] Saving model weights and runtime parameters to /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-03-29 00:40:09] Saving Adam parameters to /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-03-29 00:40:16] Saving model weights and runtime parameters to /scratch/project_2001194/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-03-29 00:40:17] [valid] Ep. 1 : Up. 10000 : perplexity : 6.61571 : new best
[2021-04-02 20:06:19] [marian] Marian v1.10.0 6f6d484 2021-02-06 15:35:16 -0800
[2021-04-02 20:06:19] [marian] Running on r03g02.bullx as process 131947 with command line:
[2021-04-02 20:06:19] [marian] /projappl/project_2001194/marian/build/marian --guided-alignment /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.spm32k-spm32k.src-trg.alg.gz --early-stopping 15 --valid-freq 10000 --valid-sets /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/val/Tatoeba-dev.src.spm32k /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/val/Tatoeba-dev.trg.spm32k --valid-metrics perplexity --valid-mini-batch 16 --valid-log /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.valid1.log --beam-size 12 --normalize 1 --allow-unk --overwrite --keep-best --model /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz --type transformer --train-sets /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.src.clean.spm32k.gz /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.trg.clean.spm32k.gz --max-length 500 --vocabs /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml --mini-batch-fit -w 24000 --maxi-batch 500 --save-freq 10000 --disp-freq 10000 --log /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.train1.log --enc-depth 6 --dec-depth 6 --transformer-heads 8 --transformer-postprocess-emb d --transformer-postprocess dan --transformer-dropout 0.1 --label-smoothing 0.1 --learn-rate 0.0003 --lr-warmup 16000 --lr-decay-inv-sqrt 16000 --lr-report --optimizer-params 0.9 0.98 1e-09 --clip-norm 5 --fp16 --tied-embeddings-all --devices 0 1 2 3 --sync-sgd --seed 1111 --sqlite --tempdir /run/nvme/job_5393846/data --exponential-smoothing
[2021-04-02 20:06:20] [config] after: 0e
[2021-04-02 20:06:20] [config] after-batches: 0
[2021-04-02 20:06:20] [config] after-epochs: 0
[2021-04-02 20:06:20] [config] all-caps-every: 0
[2021-04-02 20:06:20] [config] allow-unk: true
[2021-04-02 20:06:20] [config] authors: false
[2021-04-02 20:06:20] [config] beam-size: 12
[2021-04-02 20:06:20] [config] bert-class-symbol: "[CLS]"
[2021-04-02 20:06:20] [config] bert-mask-symbol: "[MASK]"
[2021-04-02 20:06:20] [config] bert-masking-fraction: 0.15
[2021-04-02 20:06:20] [config] bert-sep-symbol: "[SEP]"
[2021-04-02 20:06:20] [config] bert-train-type-embeddings: true
[2021-04-02 20:06:20] [config] bert-type-vocab-size: 2
[2021-04-02 20:06:20] [config] build-info: ""
[2021-04-02 20:06:20] [config] cite: false
[2021-04-02 20:06:20] [config] clip-norm: 5
[2021-04-02 20:06:20] [config] cost-scaling:
[2021-04-02 20:06:20] [config] - 7
[2021-04-02 20:06:20] [config] - 2000
[2021-04-02 20:06:20] [config] - 2
[2021-04-02 20:06:20] [config] - 0.05
[2021-04-02 20:06:20] [config] - 10
[2021-04-02 20:06:20] [config] - 1
[2021-04-02 20:06:20] [config] cost-type: ce-sum
[2021-04-02 20:06:20] [config] cpu-threads: 0
[2021-04-02 20:06:20] [config] data-weighting: ""
[2021-04-02 20:06:20] [config] data-weighting-type: sentence
[2021-04-02 20:06:20] [config] dec-cell: gru
[2021-04-02 20:06:20] [config] dec-cell-base-depth: 2
[2021-04-02 20:06:20] [config] dec-cell-high-depth: 1
[2021-04-02 20:06:20] [config] dec-depth: 6
[2021-04-02 20:06:20] [config] devices:
[2021-04-02 20:06:20] [config] - 0
[2021-04-02 20:06:20] [config] - 1
[2021-04-02 20:06:20] [config] - 2
[2021-04-02 20:06:20] [config] - 3
[2021-04-02 20:06:20] [config] dim-emb: 512
[2021-04-02 20:06:20] [config] dim-rnn: 1024
[2021-04-02 20:06:20] [config] dim-vocabs:
[2021-04-02 20:06:20] [config] - 60878
[2021-04-02 20:06:20] [config] - 60878
[2021-04-02 20:06:20] [config] disp-first: 0
[2021-04-02 20:06:20] [config] disp-freq: 10000
[2021-04-02 20:06:20] [config] disp-label-counts: true
[2021-04-02 20:06:20] [config] dropout-rnn: 0
[2021-04-02 20:06:20] [config] dropout-src: 0
[2021-04-02 20:06:20] [config] dropout-trg: 0
[2021-04-02 20:06:20] [config] dump-config: ""
[2021-04-02 20:06:20] [config] early-stopping: 15
[2021-04-02 20:06:20] [config] embedding-fix-src: false
[2021-04-02 20:06:20] [config] embedding-fix-trg: false
[2021-04-02 20:06:20] [config] embedding-normalization: false
[2021-04-02 20:06:20] [config] embedding-vectors:
[2021-04-02 20:06:20] [config] []
[2021-04-02 20:06:20] [config] enc-cell: gru
[2021-04-02 20:06:20] [config] enc-cell-depth: 1
[2021-04-02 20:06:20] [config] enc-depth: 6
[2021-04-02 20:06:20] [config] enc-type: bidirectional
[2021-04-02 20:06:20] [config] english-title-case-every: 0
[2021-04-02 20:06:20] [config] exponential-smoothing: 0.0001
[2021-04-02 20:06:20] [config] factor-weight: 1
[2021-04-02 20:06:20] [config] grad-dropping-momentum: 0
[2021-04-02 20:06:20] [config] grad-dropping-rate: 0
[2021-04-02 20:06:20] [config] grad-dropping-warmup: 100
[2021-04-02 20:06:20] [config] gradient-checkpointing: false
[2021-04-02 20:06:20] [config] guided-alignment: /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.spm32k-spm32k.src-trg.alg.gz
[2021-04-02 20:06:20] [config] guided-alignment-cost: mse
[2021-04-02 20:06:20] [config] guided-alignment-weight: 0.1
[2021-04-02 20:06:20] [config] ignore-model-config: false
[2021-04-02 20:06:20] [config] input-types:
[2021-04-02 20:06:20] [config] []
[2021-04-02 20:06:20] [config] interpolate-env-vars: false
[2021-04-02 20:06:20] [config] keep-best: true
[2021-04-02 20:06:20] [config] label-smoothing: 0.1
[2021-04-02 20:06:20] [config] layer-normalization: false
[2021-04-02 20:06:20] [config] learn-rate: 0.0003
[2021-04-02 20:06:20] [config] lemma-dim-emb: 0
[2021-04-02 20:06:20] [config] log: /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.train1.log
[2021-04-02 20:06:20] [config] log-level: info
[2021-04-02 20:06:20] [config] log-time-zone: ""
[2021-04-02 20:06:20] [config] logical-epoch:
[2021-04-02 20:06:20] [config] - 1e
[2021-04-02 20:06:20] [config] - 0
[2021-04-02 20:06:20] [config] lr-decay: 0
[2021-04-02 20:06:20] [config] lr-decay-freq: 50000
[2021-04-02 20:06:20] [config] lr-decay-inv-sqrt:
[2021-04-02 20:06:20] [config] - 16000
[2021-04-02 20:06:20] [config] lr-decay-repeat-warmup: false
[2021-04-02 20:06:20] [config] lr-decay-reset-optimizer: false
[2021-04-02 20:06:20] [config] lr-decay-start:
[2021-04-02 20:06:20] [config] - 10
[2021-04-02 20:06:20] [config] - 1
[2021-04-02 20:06:20] [config] lr-decay-strategy: epoch+stalled
[2021-04-02 20:06:20] [config] lr-report: true
[2021-04-02 20:06:20] [config] lr-warmup: 16000
[2021-04-02 20:06:20] [config] lr-warmup-at-reload: false
[2021-04-02 20:06:20] [config] lr-warmup-cycle: false
[2021-04-02 20:06:20] [config] lr-warmup-start-rate: 0
[2021-04-02 20:06:20] [config] max-length: 500
[2021-04-02 20:06:20] [config] max-length-crop: false
[2021-04-02 20:06:20] [config] max-length-factor: 3
[2021-04-02 20:06:20] [config] maxi-batch: 500
[2021-04-02 20:06:20] [config] maxi-batch-sort: trg
[2021-04-02 20:06:20] [config] mini-batch: 64
[2021-04-02 20:06:20] [config] mini-batch-fit: true
[2021-04-02 20:06:20] [config] mini-batch-fit-step: 10
[2021-04-02 20:06:20] [config] mini-batch-track-lr: false
[2021-04-02 20:06:20] [config] mini-batch-warmup: 0
[2021-04-02 20:06:20] [config] mini-batch-words: 0
[2021-04-02 20:06:20] [config] mini-batch-words-ref: 0
[2021-04-02 20:06:20] [config] model: /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-02 20:06:20] [config] multi-loss-type: sum
[2021-04-02 20:06:20] [config] multi-node: false
[2021-04-02 20:06:20] [config] multi-node-overlap: true
[2021-04-02 20:06:20] [config] n-best: false
[2021-04-02 20:06:20] [config] no-nccl: false
[2021-04-02 20:06:20] [config] no-reload: false
[2021-04-02 20:06:20] [config] no-restore-corpus: false
[2021-04-02 20:06:20] [config] normalize: 1
[2021-04-02 20:06:20] [config] normalize-gradient: false
[2021-04-02 20:06:20] [config] num-devices: 0
[2021-04-02 20:06:20] [config] optimizer: adam
[2021-04-02 20:06:20] [config] optimizer-delay: 1
[2021-04-02 20:06:20] [config] optimizer-params:
[2021-04-02 20:06:20] [config] - 0.9
[2021-04-02 20:06:20] [config] - 0.98
[2021-04-02 20:06:20] [config] - 1e-09
[2021-04-02 20:06:20] [config] output-omit-bias: false
[2021-04-02 20:06:20] [config] overwrite: true
[2021-04-02 20:06:20] [config] precision:
[2021-04-02 20:06:20] [config] - float16
[2021-04-02 20:06:20] [config] - float32
[2021-04-02 20:06:20] [config] - float32
[2021-04-02 20:06:20] [config] pretrained-model: ""
[2021-04-02 20:06:20] [config] quantize-biases: false
[2021-04-02 20:06:20] [config] quantize-bits: 0
[2021-04-02 20:06:20] [config] quantize-log-based: false
[2021-04-02 20:06:20] [config] quantize-optimization-steps: 0
[2021-04-02 20:06:20] [config] quiet: false
[2021-04-02 20:06:20] [config] quiet-translation: false
[2021-04-02 20:06:20] [config] relative-paths: false
[2021-04-02 20:06:20] [config] right-left: false
[2021-04-02 20:06:20] [config] save-freq: 10000
[2021-04-02 20:06:20] [config] seed: 1111
[2021-04-02 20:06:20] [config] sentencepiece-alphas:
[2021-04-02 20:06:20] [config] []
[2021-04-02 20:06:20] [config] sentencepiece-max-lines: 2000000
[2021-04-02 20:06:20] [config] sentencepiece-options: ""
[2021-04-02 20:06:20] [config] shuffle: data
[2021-04-02 20:06:20] [config] shuffle-in-ram: false
[2021-04-02 20:06:20] [config] sigterm: save-and-exit
[2021-04-02 20:06:20] [config] skip: false
[2021-04-02 20:06:20] [config] sqlite: temporary
[2021-04-02 20:06:20] [config] sqlite-drop: false
[2021-04-02 20:06:20] [config] sync-sgd: true
[2021-04-02 20:06:20] [config] tempdir: /run/nvme/job_5393846/data
[2021-04-02 20:06:20] [config] tied-embeddings: false
[2021-04-02 20:06:20] [config] tied-embeddings-all: true
[2021-04-02 20:06:20] [config] tied-embeddings-src: false
[2021-04-02 20:06:20] [config] train-embedder-rank:
[2021-04-02 20:06:20] [config] []
[2021-04-02 20:06:20] [config] train-sets:
[2021-04-02 20:06:20] [config] - /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.src.clean.spm32k.gz
[2021-04-02 20:06:20] [config] - /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.trg.clean.spm32k.gz
[2021-04-02 20:06:20] [config] transformer-aan-activation: swish
[2021-04-02 20:06:20] [config] transformer-aan-depth: 2
[2021-04-02 20:06:20] [config] transformer-aan-nogate: false
[2021-04-02 20:06:20] [config] transformer-decoder-autoreg: self-attention
[2021-04-02 20:06:20] [config] transformer-depth-scaling: false
[2021-04-02 20:06:20] [config] transformer-dim-aan: 2048
[2021-04-02 20:06:20] [config] transformer-dim-ffn: 2048
[2021-04-02 20:06:20] [config] transformer-dropout: 0.1
[2021-04-02 20:06:20] [config] transformer-dropout-attention: 0
[2021-04-02 20:06:20] [config] transformer-dropout-ffn: 0
[2021-04-02 20:06:20] [config] transformer-ffn-activation: swish
[2021-04-02 20:06:20] [config] transformer-ffn-depth: 2
[2021-04-02 20:06:20] [config] transformer-guided-alignment-layer: last
[2021-04-02 20:06:20] [config] transformer-heads: 8
[2021-04-02 20:06:20] [config] transformer-no-projection: false
[2021-04-02 20:06:20] [config] transformer-pool: false
[2021-04-02 20:06:20] [config] transformer-postprocess: dan
[2021-04-02 20:06:20] [config] transformer-postprocess-emb: d
[2021-04-02 20:06:20] [config] transformer-postprocess-top: ""
[2021-04-02 20:06:20] [config] transformer-preprocess: ""
[2021-04-02 20:06:20] [config] transformer-tied-layers:
[2021-04-02 20:06:20] [config] []
[2021-04-02 20:06:20] [config] transformer-train-position-embeddings: false
[2021-04-02 20:06:20] [config] tsv: false
[2021-04-02 20:06:20] [config] tsv-fields: 0
[2021-04-02 20:06:20] [config] type: transformer
[2021-04-02 20:06:20] [config] ulr: false
[2021-04-02 20:06:20] [config] ulr-dim-emb: 0
[2021-04-02 20:06:20] [config] ulr-dropout: 0
[2021-04-02 20:06:20] [config] ulr-keys-vectors: ""
[2021-04-02 20:06:20] [config] ulr-query-vectors: ""
[2021-04-02 20:06:20] [config] ulr-softmax-temperature: 1
[2021-04-02 20:06:20] [config] ulr-trainable-transformation: false
[2021-04-02 20:06:20] [config] unlikelihood-loss: false
[2021-04-02 20:06:20] [config] valid-freq: 10000
[2021-04-02 20:06:20] [config] valid-log: /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.valid1.log
[2021-04-02 20:06:20] [config] valid-max-length: 1000
[2021-04-02 20:06:20] [config] valid-metrics:
[2021-04-02 20:06:20] [config] - perplexity
[2021-04-02 20:06:20] [config] valid-mini-batch: 16
[2021-04-02 20:06:20] [config] valid-reset-stalled: false
[2021-04-02 20:06:20] [config] valid-script-args:
[2021-04-02 20:06:20] [config] []
[2021-04-02 20:06:20] [config] valid-script-path: ""
[2021-04-02 20:06:20] [config] valid-sets:
[2021-04-02 20:06:20] [config] - /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/val/Tatoeba-dev.src.spm32k
[2021-04-02 20:06:20] [config] - /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/val/Tatoeba-dev.trg.spm32k
[2021-04-02 20:06:20] [config] valid-translation-output: ""
[2021-04-02 20:06:20] [config] version: v1.10.0 6f6d484 2021-02-06 15:35:16 -0800
[2021-04-02 20:06:20] [config] vocabs:
[2021-04-02 20:06:20] [config] - /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml
[2021-04-02 20:06:20] [config] - /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml
[2021-04-02 20:06:20] [config] word-penalty: 0
[2021-04-02 20:06:20] [config] word-scores: false
[2021-04-02 20:06:20] [config] workspace: 24000
[2021-04-02 20:06:20] [config] Loaded model has been created with Marian v1.10.0 6f6d484 2021-02-06 15:35:16 -0800
[2021-04-02 20:06:20] Using synchronous SGD
[2021-04-02 20:06:20] [data] Loading vocabulary from JSON/Yaml file /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml
[2021-04-02 20:06:20] [data] Setting vocabulary size for input 0 to 60,878
[2021-04-02 20:06:20] [data] Loading vocabulary from JSON/Yaml file /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml
[2021-04-02 20:06:21] [data] Setting vocabulary size for input 1 to 60,878
[2021-04-02 20:06:21] [data] Using word alignments from file /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.spm32k-spm32k.src-trg.alg.gz
[2021-04-02 20:06:21] [sqlite] Creating temporary database in /run/nvme/job_5393846/data
[2021-04-02 20:06:25] [sqlite] Inserted 1000000 lines
[2021-04-02 20:06:29] [sqlite] Inserted 2000000 lines
[2021-04-02 20:06:37] [sqlite] Inserted 4000000 lines
[2021-04-02 20:06:52] [sqlite] Inserted 8000000 lines
[2021-04-02 20:07:24] [sqlite] Inserted 16000000 lines
[2021-04-02 20:08:26] [sqlite] Inserted 32000000 lines
[2021-04-02 20:09:57] [sqlite] Inserted 64000000 lines
[2021-04-02 20:11:39] [sqlite] Inserted 90910822 lines
[2021-04-02 20:11:39] [sqlite] Creating primary index
[2021-04-02 20:12:26] [comm] Compiled without MPI support. Running as a single process on r03g02.bullx
[2021-04-02 20:12:26] [batching] Collecting statistics for batch fitting with step size 10
[2021-04-02 20:12:37] [memory] Extending reserved space to 24064 MB (device gpu0)
[2021-04-02 20:12:38] [memory] Extending reserved space to 24064 MB (device gpu1)
[2021-04-02 20:12:39] [memory] Extending reserved space to 24064 MB (device gpu2)
[2021-04-02 20:12:39] [memory] Extending reserved space to 24064 MB (device gpu3)
[2021-04-02 20:12:39] [comm] Using NCCL 2.8.3 for GPU communication
[2021-04-02 20:12:41] [comm] NCCLCommunicator constructed successfully
[2021-04-02 20:12:41] [training] Using 4 GPUs
[2021-04-02 20:12:41] [logits] Applying loss function for 1 factor(s)
[2021-04-02 20:12:41] [memory] Reserving 287 MB, device gpu0
[2021-04-02 20:12:42] [gpu] 16-bit TensorCores enabled for float32 matrix operations
[2021-04-02 20:12:43] [memory] Reserving 287 MB, device gpu0
[2021-04-02 20:15:28] [batching] Done. Typical MB size is 57,739 target words
[2021-04-02 20:15:29] [memory] Extending reserved space to 24064 MB (device gpu0)
[2021-04-02 20:15:29] [memory] Extending reserved space to 24064 MB (device gpu1)
[2021-04-02 20:15:29] [memory] Extending reserved space to 24064 MB (device gpu2)
[2021-04-02 20:15:30] [memory] Extending reserved space to 24064 MB (device gpu3)
[2021-04-02 20:15:30] [comm] Using NCCL 2.8.3 for GPU communication
[2021-04-02 20:15:31] [comm] NCCLCommunicator constructed successfully
[2021-04-02 20:15:31] [training] Using 4 GPUs
[2021-04-02 20:15:31] Loading model from /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-02 20:15:32] Loading model from /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-02 20:15:32] Loading model from /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-02 20:15:33] Loading model from /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-02 20:15:33] Loading Adam parameters from /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-02 20:15:35] [memory] Reserving 143 MB, device gpu0
[2021-04-02 20:15:35] [memory] Reserving 143 MB, device gpu1
[2021-04-02 20:15:35] [memory] Reserving 143 MB, device gpu2
[2021-04-02 20:15:35] [memory] Reserving 143 MB, device gpu3
[2021-04-02 20:15:35] [training] Model reloaded from /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-02 20:15:35] [data] Restoring the corpus state to epoch 1, batch 10000
[2021-04-02 20:15:35] [sqlite] Selecting shuffled data
[2021-04-02 20:20:55] Training started
[2021-04-02 20:20:55] [training] Batches are processed as 1 process(es) x 4 devices/process
[2021-04-02 20:20:55] [memory] Reserving 287 MB, device gpu0
[2021-04-02 20:20:55] [memory] Reserving 287 MB, device gpu1
[2021-04-02 20:20:55] [memory] Reserving 287 MB, device gpu3
[2021-04-02 20:20:55] [memory] Reserving 287 MB, device gpu2
[2021-04-02 20:20:56] [memory] Reserving 287 MB, device gpu0
[2021-04-02 20:20:56] [memory] Reserving 287 MB, device gpu1
[2021-04-02 20:20:56] [memory] Reserving 287 MB, device gpu3
[2021-04-02 20:20:57] [memory] Reserving 287 MB, device gpu2
[2021-04-02 20:20:57] Loading model from /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-02 20:20:58] [memory] Reserving 287 MB, device cpu0
[2021-04-02 20:20:58] [memory] Reserving 71 MB, device gpu0
[2021-04-02 20:20:58] [memory] Reserving 71 MB, device gpu1
[2021-04-02 20:20:58] [memory] Reserving 71 MB, device gpu2
[2021-04-02 20:20:58] [memory] Reserving 71 MB, device gpu3
[2021-04-02 22:25:35] Ep. 1 : Up. 20000 : Sen. 15,852,639 : Cost 0.42225203 * 1,534,689,396 @ 21,129 after 3,074,337,819 : Time 7806.09s : 24575.01 words/s : L.r. 2.6833e-04
[2021-04-02 22:25:35] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-02 22:25:38] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-02 22:25:40] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-02 22:25:47] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-02 22:25:48] [valid] Ep. 1 : Up. 20000 : perplexity : 3.20918 : new best
[2021-04-10 20:28:59] [marian] Marian v1.10.0 6f6d484 2021-02-06 15:35:16 -0800
[2021-04-10 20:28:59] [marian] Running on r13g05.bullx as process 132622 with command line:
[2021-04-10 20:28:59] [marian] /projappl/project_2001194/marian/build/marian --guided-alignment /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.spm32k-spm32k.src-trg.alg.gz --early-stopping 15 --valid-freq 10000 --valid-sets /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/val/Tatoeba-dev.src.spm32k /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/val/Tatoeba-dev.trg.spm32k --valid-metrics perplexity --valid-mini-batch 16 --valid-log /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.valid1.log --beam-size 12 --normalize 1 --allow-unk --overwrite --keep-best --model /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz --type transformer --train-sets /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.src.clean.spm32k.gz /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.trg.clean.spm32k.gz --max-length 500 --vocabs /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml --mini-batch-fit -w 24000 --maxi-batch 500 --save-freq 10000 --disp-freq 10000 --log /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.train1.log --enc-depth 6 --dec-depth 6 --transformer-heads 8 --transformer-postprocess-emb d --transformer-postprocess dan --transformer-dropout 0.1 --label-smoothing 0.1 --learn-rate 0.0003 --lr-warmup 16000 --lr-decay-inv-sqrt 16000 --lr-report --optimizer-params 0.9 0.98 1e-09 --clip-norm 5 --fp16 --tied-embeddings-all --devices 0 1 2 3 --sync-sgd --seed 1111 --sqlite --tempdir /run/nvme/job_5481217/data --exponential-smoothing
[2021-04-10 20:29:01] [config] after: 0e
[2021-04-10 20:29:01] [config] after-batches: 0
[2021-04-10 20:29:01] [config] after-epochs: 0
[2021-04-10 20:29:01] [config] all-caps-every: 0
[2021-04-10 20:29:01] [config] allow-unk: true
[2021-04-10 20:29:01] [config] authors: false
[2021-04-10 20:29:01] [config] beam-size: 12
[2021-04-10 20:29:01] [config] bert-class-symbol: "[CLS]"
[2021-04-10 20:29:01] [config] bert-mask-symbol: "[MASK]"
[2021-04-10 20:29:01] [config] bert-masking-fraction: 0.15
[2021-04-10 20:29:01] [config] bert-sep-symbol: "[SEP]"
[2021-04-10 20:29:01] [config] bert-train-type-embeddings: true
[2021-04-10 20:29:01] [config] bert-type-vocab-size: 2
[2021-04-10 20:29:01] [config] build-info: ""
[2021-04-10 20:29:01] [config] cite: false
[2021-04-10 20:29:01] [config] clip-norm: 5
[2021-04-10 20:29:01] [config] cost-scaling:
[2021-04-10 20:29:01] [config] - 7
[2021-04-10 20:29:01] [config] - 2000
[2021-04-10 20:29:01] [config] - 2
[2021-04-10 20:29:01] [config] - 0.05
[2021-04-10 20:29:01] [config] - 10
[2021-04-10 20:29:01] [config] - 1
[2021-04-10 20:29:01] [config] cost-type: ce-sum
[2021-04-10 20:29:01] [config] cpu-threads: 0
[2021-04-10 20:29:01] [config] data-weighting: ""
[2021-04-10 20:29:01] [config] data-weighting-type: sentence
[2021-04-10 20:29:01] [config] dec-cell: gru
[2021-04-10 20:29:01] [config] dec-cell-base-depth: 2
[2021-04-10 20:29:01] [config] dec-cell-high-depth: 1
[2021-04-10 20:29:01] [config] dec-depth: 6
[2021-04-10 20:29:01] [config] devices:
[2021-04-10 20:29:01] [config] - 0
[2021-04-10 20:29:01] [config] - 1
[2021-04-10 20:29:01] [config] - 2
[2021-04-10 20:29:01] [config] - 3
[2021-04-10 20:29:01] [config] dim-emb: 512
[2021-04-10 20:29:01] [config] dim-rnn: 1024
[2021-04-10 20:29:01] [config] dim-vocabs:
[2021-04-10 20:29:01] [config] - 60878
[2021-04-10 20:29:01] [config] - 60878
[2021-04-10 20:29:01] [config] disp-first: 0
[2021-04-10 20:29:01] [config] disp-freq: 10000
[2021-04-10 20:29:01] [config] disp-label-counts: true
[2021-04-10 20:29:01] [config] dropout-rnn: 0
[2021-04-10 20:29:01] [config] dropout-src: 0
[2021-04-10 20:29:01] [config] dropout-trg: 0
[2021-04-10 20:29:01] [config] dump-config: ""
[2021-04-10 20:29:01] [config] early-stopping: 15
[2021-04-10 20:29:01] [config] embedding-fix-src: false
[2021-04-10 20:29:01] [config] embedding-fix-trg: false
[2021-04-10 20:29:01] [config] embedding-normalization: false
[2021-04-10 20:29:01] [config] embedding-vectors:
[2021-04-10 20:29:01] [config] []
[2021-04-10 20:29:01] [config] enc-cell: gru
[2021-04-10 20:29:01] [config] enc-cell-depth: 1
[2021-04-10 20:29:01] [config] enc-depth: 6
[2021-04-10 20:29:01] [config] enc-type: bidirectional
[2021-04-10 20:29:01] [config] english-title-case-every: 0
[2021-04-10 20:29:01] [config] exponential-smoothing: 0.0001
[2021-04-10 20:29:01] [config] factor-weight: 1
[2021-04-10 20:29:01] [config] grad-dropping-momentum: 0
[2021-04-10 20:29:01] [config] grad-dropping-rate: 0
[2021-04-10 20:29:01] [config] grad-dropping-warmup: 100
[2021-04-10 20:29:01] [config] gradient-checkpointing: false
[2021-04-10 20:29:01] [config] guided-alignment: /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.spm32k-spm32k.src-trg.alg.gz
[2021-04-10 20:29:01] [config] guided-alignment-cost: mse
[2021-04-10 20:29:01] [config] guided-alignment-weight: 0.1
[2021-04-10 20:29:01] [config] ignore-model-config: false
[2021-04-10 20:29:01] [config] input-types:
[2021-04-10 20:29:01] [config] []
[2021-04-10 20:29:01] [config] interpolate-env-vars: false
[2021-04-10 20:29:01] [config] keep-best: true
[2021-04-10 20:29:01] [config] label-smoothing: 0.1
[2021-04-10 20:29:01] [config] layer-normalization: false
[2021-04-10 20:29:01] [config] learn-rate: 0.0003
[2021-04-10 20:29:01] [config] lemma-dim-emb: 0
[2021-04-10 20:29:01] [config] log: /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.train1.log
[2021-04-10 20:29:01] [config] log-level: info
[2021-04-10 20:29:01] [config] log-time-zone: ""
[2021-04-10 20:29:01] [config] logical-epoch:
[2021-04-10 20:29:01] [config] - 1e
[2021-04-10 20:29:01] [config] - 0
[2021-04-10 20:29:01] [config] lr-decay: 0
[2021-04-10 20:29:01] [config] lr-decay-freq: 50000
[2021-04-10 20:29:01] [config] lr-decay-inv-sqrt:
[2021-04-10 20:29:01] [config] - 16000
[2021-04-10 20:29:01] [config] lr-decay-repeat-warmup: false
[2021-04-10 20:29:01] [config] lr-decay-reset-optimizer: false
[2021-04-10 20:29:01] [config] lr-decay-start:
[2021-04-10 20:29:01] [config] - 10
[2021-04-10 20:29:01] [config] - 1
[2021-04-10 20:29:01] [config] lr-decay-strategy: epoch+stalled
[2021-04-10 20:29:01] [config] lr-report: true
[2021-04-10 20:29:01] [config] lr-warmup: 16000
[2021-04-10 20:29:01] [config] lr-warmup-at-reload: false
[2021-04-10 20:29:01] [config] lr-warmup-cycle: false
[2021-04-10 20:29:01] [config] lr-warmup-start-rate: 0
[2021-04-10 20:29:01] [config] max-length: 500
[2021-04-10 20:29:01] [config] max-length-crop: false
[2021-04-10 20:29:01] [config] max-length-factor: 3
[2021-04-10 20:29:01] [config] maxi-batch: 500
[2021-04-10 20:29:01] [config] maxi-batch-sort: trg
[2021-04-10 20:29:01] [config] mini-batch: 64
[2021-04-10 20:29:01] [config] mini-batch-fit: true
[2021-04-10 20:29:01] [config] mini-batch-fit-step: 10
[2021-04-10 20:29:01] [config] mini-batch-track-lr: false
[2021-04-10 20:29:01] [config] mini-batch-warmup: 0
[2021-04-10 20:29:01] [config] mini-batch-words: 0
[2021-04-10 20:29:01] [config] mini-batch-words-ref: 0
[2021-04-10 20:29:01] [config] model: /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-10 20:29:01] [config] multi-loss-type: sum
[2021-04-10 20:29:01] [config] multi-node: false
[2021-04-10 20:29:01] [config] multi-node-overlap: true
[2021-04-10 20:29:01] [config] n-best: false
[2021-04-10 20:29:01] [config] no-nccl: false
[2021-04-10 20:29:01] [config] no-reload: false
[2021-04-10 20:29:01] [config] no-restore-corpus: false
[2021-04-10 20:29:01] [config] normalize: 1
[2021-04-10 20:29:01] [config] normalize-gradient: false
[2021-04-10 20:29:01] [config] num-devices: 0
[2021-04-10 20:29:01] [config] optimizer: adam
[2021-04-10 20:29:01] [config] optimizer-delay: 1
[2021-04-10 20:29:01] [config] optimizer-params:
[2021-04-10 20:29:01] [config] - 0.9
[2021-04-10 20:29:01] [config] - 0.98
[2021-04-10 20:29:01] [config] - 1e-09
[2021-04-10 20:29:01] [config] output-omit-bias: false
[2021-04-10 20:29:01] [config] overwrite: true
[2021-04-10 20:29:01] [config] precision:
[2021-04-10 20:29:01] [config] - float16
[2021-04-10 20:29:01] [config] - float32
[2021-04-10 20:29:01] [config] - float32
[2021-04-10 20:29:01] [config] pretrained-model: ""
[2021-04-10 20:29:01] [config] quantize-biases: false
[2021-04-10 20:29:01] [config] quantize-bits: 0
[2021-04-10 20:29:01] [config] quantize-log-based: false
[2021-04-10 20:29:01] [config] quantize-optimization-steps: 0
[2021-04-10 20:29:01] [config] quiet: false
[2021-04-10 20:29:01] [config] quiet-translation: false
[2021-04-10 20:29:01] [config] relative-paths: false
[2021-04-10 20:29:01] [config] right-left: false
[2021-04-10 20:29:01] [config] save-freq: 10000
[2021-04-10 20:29:01] [config] seed: 1111
[2021-04-10 20:29:01] [config] sentencepiece-alphas:
[2021-04-10 20:29:01] [config] []
[2021-04-10 20:29:01] [config] sentencepiece-max-lines: 2000000
[2021-04-10 20:29:01] [config] sentencepiece-options: ""
[2021-04-10 20:29:01] [config] shuffle: data
[2021-04-10 20:29:01] [config] shuffle-in-ram: false
[2021-04-10 20:29:01] [config] sigterm: save-and-exit
[2021-04-10 20:29:01] [config] skip: false
[2021-04-10 20:29:01] [config] sqlite: temporary
[2021-04-10 20:29:01] [config] sqlite-drop: false
[2021-04-10 20:29:01] [config] sync-sgd: true
[2021-04-10 20:29:01] [config] tempdir: /run/nvme/job_5481217/data
[2021-04-10 20:29:01] [config] tied-embeddings: false
[2021-04-10 20:29:01] [config] tied-embeddings-all: true
[2021-04-10 20:29:01] [config] tied-embeddings-src: false
[2021-04-10 20:29:01] [config] train-embedder-rank:
[2021-04-10 20:29:01] [config] []
[2021-04-10 20:29:01] [config] train-sets:
[2021-04-10 20:29:01] [config] - /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.src.clean.spm32k.gz
[2021-04-10 20:29:01] [config] - /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.trg.clean.spm32k.gz
[2021-04-10 20:29:01] [config] transformer-aan-activation: swish
[2021-04-10 20:29:01] [config] transformer-aan-depth: 2
[2021-04-10 20:29:01] [config] transformer-aan-nogate: false
[2021-04-10 20:29:01] [config] transformer-decoder-autoreg: self-attention
[2021-04-10 20:29:01] [config] transformer-depth-scaling: false
[2021-04-10 20:29:01] [config] transformer-dim-aan: 2048
[2021-04-10 20:29:01] [config] transformer-dim-ffn: 2048
[2021-04-10 20:29:01] [config] transformer-dropout: 0.1
[2021-04-10 20:29:01] [config] transformer-dropout-attention: 0
[2021-04-10 20:29:01] [config] transformer-dropout-ffn: 0
[2021-04-10 20:29:01] [config] transformer-ffn-activation: swish
[2021-04-10 20:29:01] [config] transformer-ffn-depth: 2
[2021-04-10 20:29:01] [config] transformer-guided-alignment-layer: last
[2021-04-10 20:29:01] [config] transformer-heads: 8
[2021-04-10 20:29:01] [config] transformer-no-projection: false
[2021-04-10 20:29:01] [config] transformer-pool: false
[2021-04-10 20:29:01] [config] transformer-postprocess: dan
[2021-04-10 20:29:01] [config] transformer-postprocess-emb: d
[2021-04-10 20:29:01] [config] transformer-postprocess-top: ""
[2021-04-10 20:29:01] [config] transformer-preprocess: ""
[2021-04-10 20:29:01] [config] transformer-tied-layers:
[2021-04-10 20:29:01] [config] []
[2021-04-10 20:29:01] [config] transformer-train-position-embeddings: false
[2021-04-10 20:29:01] [config] tsv: false
[2021-04-10 20:29:01] [config] tsv-fields: 0
[2021-04-10 20:29:01] [config] type: transformer
[2021-04-10 20:29:01] [config] ulr: false
[2021-04-10 20:29:01] [config] ulr-dim-emb: 0
[2021-04-10 20:29:01] [config] ulr-dropout: 0
[2021-04-10 20:29:01] [config] ulr-keys-vectors: ""
[2021-04-10 20:29:01] [config] ulr-query-vectors: ""
[2021-04-10 20:29:01] [config] ulr-softmax-temperature: 1
[2021-04-10 20:29:01] [config] ulr-trainable-transformation: false
[2021-04-10 20:29:01] [config] unlikelihood-loss: false
[2021-04-10 20:29:01] [config] valid-freq: 10000
[2021-04-10 20:29:01] [config] valid-log: /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.valid1.log
[2021-04-10 20:29:01] [config] valid-max-length: 1000
[2021-04-10 20:29:01] [config] valid-metrics:
[2021-04-10 20:29:01] [config] - perplexity
[2021-04-10 20:29:01] [config] valid-mini-batch: 16
[2021-04-10 20:29:01] [config] valid-reset-stalled: false
[2021-04-10 20:29:01] [config] valid-script-args:
[2021-04-10 20:29:01] [config] []
[2021-04-10 20:29:01] [config] valid-script-path: ""
[2021-04-10 20:29:01] [config] valid-sets:
[2021-04-10 20:29:01] [config] - /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/val/Tatoeba-dev.src.spm32k
[2021-04-10 20:29:01] [config] - /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/val/Tatoeba-dev.trg.spm32k
[2021-04-10 20:29:01] [config] valid-translation-output: ""
[2021-04-10 20:29:01] [config] version: v1.10.0 6f6d484 2021-02-06 15:35:16 -0800
[2021-04-10 20:29:01] [config] vocabs:
[2021-04-10 20:29:01] [config] - /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml
[2021-04-10 20:29:01] [config] - /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml
[2021-04-10 20:29:01] [config] word-penalty: 0
[2021-04-10 20:29:01] [config] word-scores: false
[2021-04-10 20:29:01] [config] workspace: 24000
[2021-04-10 20:29:01] [config] Loaded model has been created with Marian v1.10.0 6f6d484 2021-02-06 15:35:16 -0800
[2021-04-10 20:29:01] Using synchronous SGD
[2021-04-10 20:29:01] [data] Loading vocabulary from JSON/Yaml file /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml
[2021-04-10 20:29:02] [data] Setting vocabulary size for input 0 to 60,878
[2021-04-10 20:29:02] [data] Loading vocabulary from JSON/Yaml file /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.vocab.yml
[2021-04-10 20:29:02] [data] Setting vocabulary size for input 1 to 60,878
[2021-04-10 20:29:02] [data] Using word alignments from file /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/train/opus+bt.spm32k-spm32k.src-trg.alg.gz
[2021-04-10 20:29:02] [sqlite] Creating temporary database in /run/nvme/job_5481217/data
[2021-04-10 20:29:06] [sqlite] Inserted 1000000 lines
[2021-04-10 20:29:10] [sqlite] Inserted 2000000 lines
[2021-04-10 20:29:18] [sqlite] Inserted 4000000 lines
[2021-04-10 20:29:33] [sqlite] Inserted 8000000 lines
[2021-04-10 20:30:05] [sqlite] Inserted 16000000 lines
[2021-04-10 20:31:08] [sqlite] Inserted 32000000 lines
[2021-04-10 20:32:38] [sqlite] Inserted 64000000 lines
[2021-04-10 20:34:20] [sqlite] Inserted 90910822 lines
[2021-04-10 20:34:20] [sqlite] Creating primary index
[2021-04-10 20:35:07] [comm] Compiled without MPI support. Running as a single process on r13g05.bullx
[2021-04-10 20:35:07] [batching] Collecting statistics for batch fitting with step size 10
[2021-04-10 20:35:20] [memory] Extending reserved space to 24064 MB (device gpu0)
[2021-04-10 20:35:20] [memory] Extending reserved space to 24064 MB (device gpu1)
[2021-04-10 20:35:21] [memory] Extending reserved space to 24064 MB (device gpu2)
[2021-04-10 20:35:21] [memory] Extending reserved space to 24064 MB (device gpu3)
[2021-04-10 20:35:21] [comm] Using NCCL 2.8.3 for GPU communication
[2021-04-10 20:35:23] [comm] NCCLCommunicator constructed successfully
[2021-04-10 20:35:23] [training] Using 4 GPUs
[2021-04-10 20:35:23] [logits] Applying loss function for 1 factor(s)
[2021-04-10 20:35:23] [memory] Reserving 287 MB, device gpu0
[2021-04-10 20:35:23] [gpu] 16-bit TensorCores enabled for float32 matrix operations
[2021-04-10 20:35:24] [memory] Reserving 287 MB, device gpu0
[2021-04-10 20:38:11] [batching] Done. Typical MB size is 57,739 target words
[2021-04-10 20:38:11] [memory] Extending reserved space to 24064 MB (device gpu0)
[2021-04-10 20:38:11] [memory] Extending reserved space to 24064 MB (device gpu1)
[2021-04-10 20:38:11] [memory] Extending reserved space to 24064 MB (device gpu2)
[2021-04-10 20:38:11] [memory] Extending reserved space to 24064 MB (device gpu3)
[2021-04-10 20:38:11] [comm] Using NCCL 2.8.3 for GPU communication
[2021-04-10 20:38:13] [comm] NCCLCommunicator constructed successfully
[2021-04-10 20:38:13] [training] Using 4 GPUs
[2021-04-10 20:38:13] Loading model from /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-10 20:38:13] Loading model from /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-10 20:38:14] Loading model from /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-10 20:38:14] Loading model from /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-10 20:38:15] Loading Adam parameters from /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-10 20:38:16] [memory] Reserving 143 MB, device gpu0
[2021-04-10 20:38:16] [memory] Reserving 143 MB, device gpu1
[2021-04-10 20:38:16] [memory] Reserving 143 MB, device gpu2
[2021-04-10 20:38:16] [memory] Reserving 143 MB, device gpu3
[2021-04-10 20:38:17] [training] Model reloaded from /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-10 20:38:17] [data] Restoring the corpus state to epoch 1, batch 20000
[2021-04-10 20:38:17] [sqlite] Selecting shuffled data
[2021-04-10 20:47:06] Training started
[2021-04-10 20:47:59] [training] Batches are processed as 1 process(es) x 4 devices/process
[2021-04-10 20:47:59] [memory] Reserving 287 MB, device gpu0
[2021-04-10 20:47:59] [memory] Reserving 287 MB, device gpu1
[2021-04-10 20:47:59] [memory] Reserving 287 MB, device gpu3
[2021-04-10 20:47:59] [memory] Reserving 287 MB, device gpu2
[2021-04-10 20:47:59] [memory] Reserving 287 MB, device gpu1
[2021-04-10 20:47:59] [memory] Reserving 287 MB, device gpu0
[2021-04-10 20:47:59] [memory] Reserving 287 MB, device gpu2
[2021-04-10 20:47:59] [memory] Reserving 287 MB, device gpu3
[2021-04-10 20:47:59] Loading model from /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-10 20:48:01] [memory] Reserving 287 MB, device cpu0
[2021-04-10 20:48:01] [memory] Reserving 71 MB, device gpu0
[2021-04-10 20:48:01] [memory] Reserving 71 MB, device gpu1
[2021-04-10 20:48:01] [memory] Reserving 71 MB, device gpu2
[2021-04-10 20:48:01] [memory] Reserving 71 MB, device gpu3
[2021-04-10 22:53:55] Ep. 1 : Up. 30000 : Sen. 23,809,627 : Cost 0.37059498 * 1,537,476,548 @ 19,307 after 4,611,814,367 : Time 8144.22s : 23660.41 words/s : L.r. 2.1909e-04
[2021-04-10 22:53:55] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-10 22:53:57] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-10 22:53:59] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-10 22:54:06] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-10 22:54:07] [valid] Ep. 1 : Up. 30000 : perplexity : 2.74846 : new best
[2021-04-11 01:00:17] Ep. 1 : Up. 40000 : Sen. 31,775,328 : Cost 0.35317293 * 1,542,109,245 @ 10,837 after 6,153,923,612 : Time 7581.80s : 25443.81 words/s : L.r. 1.8974e-04
[2021-04-11 01:00:17] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-11 01:00:19] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-11 01:00:21] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-11 01:00:27] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-11 01:00:28] [valid] Ep. 1 : Up. 40000 : perplexity : 2.57709 : new best
[2021-04-11 03:06:32] Ep. 1 : Up. 50000 : Sen. 39,739,557 : Cost 0.34560379 * 1,537,112,128 @ 25,070 after 7,691,035,740 : Time 7574.64s : 25455.78 words/s : L.r. 1.6971e-04
[2021-04-11 03:06:32] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-11 03:06:33] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-11 03:06:35] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-11 03:06:41] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-11 03:06:42] [valid] Ep. 1 : Up. 50000 : perplexity : 2.48823 : new best
[2021-04-11 05:13:04] Ep. 1 : Up. 60000 : Sen. 47,717,024 : Cost 0.33947721 * 1,544,423,387 @ 29,067 after 9,235,459,127 : Time 7591.92s : 25468.52 words/s : L.r. 1.5492e-04
[2021-04-11 05:13:04] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-11 05:13:05] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-11 05:13:07] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-11 05:13:13] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-11 05:13:14] [valid] Ep. 1 : Up. 60000 : perplexity : 2.42648 : new best
[2021-04-11 07:19:16] Ep. 1 : Up. 70000 : Sen. 55,678,568 : Cost 0.33626568 * 1,536,569,921 @ 14,568 after 10,772,029,048 : Time 7572.11s : 25449.14 words/s : L.r. 1.4343e-04
[2021-04-11 07:19:16] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-11 07:19:18] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-11 07:19:19] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-11 07:19:26] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-11 07:19:26] [valid] Ep. 1 : Up. 70000 : perplexity : 2.38283 : new best
[2021-04-11 09:25:48] Ep. 1 : Up. 80000 : Sen. 63,653,841 : Cost 0.33299673 * 1,541,968,996 @ 16,641 after 12,313,998,044 : Time 7591.60s : 25446.59 words/s : L.r. 1.3416e-04
[2021-04-11 09:25:48] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-11 09:25:50] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-11 09:25:51] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-11 09:25:58] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-11 09:25:59] [valid] Ep. 1 : Up. 80000 : perplexity : 2.35194 : new best
[2021-04-11 11:32:04] Ep. 1 : Up. 90000 : Sen. 71,619,033 : Cost 0.33060694 * 1,540,330,654 @ 16,817 after 13,854,328,698 : Time 7575.88s : 25458.47 words/s : L.r. 1.2649e-04
[2021-04-11 11:32:04] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-11 11:32:06] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-11 11:32:08] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-11 11:32:15] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-11 11:32:15] [valid] Ep. 1 : Up. 90000 : perplexity : 2.3261 : new best
[2021-04-11 13:38:41] Ep. 1 : Up. 100000 : Sen. 79,596,476 : Cost 0.32837385 * 1,544,307,939 @ 14,566 after 15,398,636,637 : Time 7596.82s : 25424.10 words/s : L.r. 1.2000e-04
[2021-04-11 13:38:41] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-11 13:38:43] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-11 13:38:45] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-11 13:38:51] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-11 13:38:52] [valid] Ep. 1 : Up. 100000 : perplexity : 2.30587 : new best
[2021-04-11 15:45:05] Ep. 1 : Up. 110000 : Sen. 87,552,837 : Cost 0.32707089 * 1,539,439,336 @ 23,740 after 16,938,075,973 : Time 7584.29s : 25408.62 words/s : L.r. 1.1442e-04
[2021-04-11 15:45:05] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-11 15:45:08] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-11 15:45:09] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-11 15:45:16] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-11 15:45:17] [valid] Ep. 1 : Up. 110000 : perplexity : 2.28881 : new best
[2021-04-11 16:38:26] Seen 90910822 samples
[2021-04-11 16:38:26] Starting data epoch 2 in logical epoch 2
[2021-04-11 16:38:26] [sqlite] Selecting shuffled data
[2021-04-11 17:53:05] Ep. 2 : Up. 120000 : Sen. 4,604,102 : Cost 0.32645020 * 1,535,115,036 @ 21,005 after 18,473,191,009 : Time 7678.92s : 25099.86 words/s : L.r. 1.0954e-04
[2021-04-11 17:53:05] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-11 17:53:07] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-11 17:53:09] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-11 17:53:15] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-11 17:53:16] [valid] Ep. 2 : Up. 120000 : perplexity : 2.27459 : new best
[2021-04-11 19:59:32] Ep. 2 : Up. 130000 : Sen. 12,580,452 : Cost 0.32457662 * 1,541,876,924 @ 17,454 after 20,015,067,933 : Time 7587.52s : 25462.31 words/s : L.r. 1.0525e-04
[2021-04-11 19:59:32] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-11 19:59:34] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-11 19:59:36] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-11 19:59:42] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-11 19:59:43] [valid] Ep. 2 : Up. 130000 : perplexity : 2.2625 : new best
[2021-04-11 22:05:59] Ep. 2 : Up. 140000 : Sen. 20,552,521 : Cost 0.32394958 * 1,540,359,478 @ 32,921 after 21,555,427,411 : Time 7586.69s : 25455.14 words/s : L.r. 1.0142e-04
[2021-04-11 22:05:59] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-11 22:06:01] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-11 22:06:03] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-11 22:06:09] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-11 22:06:10] [valid] Ep. 2 : Up. 140000 : perplexity : 2.25244 : new best
[2021-04-12 00:12:10] Ep. 2 : Up. 150000 : Sen. 28,516,342 : Cost 0.32292783 * 1,538,883,913 @ 15,698 after 23,094,311,324 : Time 7571.13s : 25470.97 words/s : L.r. 9.7980e-05
[2021-04-12 00:12:10] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-12 00:12:12] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-12 00:12:14] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-12 00:12:20] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-12 00:12:21] [valid] Ep. 2 : Up. 150000 : perplexity : 2.24363 : new best
[2021-04-12 02:18:19] Ep. 2 : Up. 160000 : Sen. 36,472,921 : Cost 0.32204559 * 1,538,444,007 @ 14,634 after 24,632,755,331 : Time 7568.51s : 25454.85 words/s : L.r. 9.4868e-05
[2021-04-12 02:18:19] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-12 02:18:21] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-12 02:18:23] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-12 02:18:29] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-12 02:18:30] [valid] Ep. 2 : Up. 160000 : perplexity : 2.23516 : new best
[2021-04-12 04:24:41] Ep. 2 : Up. 170000 : Sen. 44,432,707 : Cost 0.32138234 * 1,539,440,066 @ 20,548 after 26,172,195,397 : Time 7582.03s : 25428.92 words/s : L.r. 9.2036e-05
[2021-04-12 04:24:41] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-12 04:24:43] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-12 04:24:45] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-12 04:24:51] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-12 04:24:52] [valid] Ep. 2 : Up. 170000 : perplexity : 2.22792 : new best
[2021-04-12 06:30:29] Ep. 2 : Up. 180000 : Sen. 52,350,704 : Cost 0.32022747 * 1,534,576,184 @ 19,611 after 27,706,771,581 : Time 7547.89s : 25419.56 words/s : L.r. 8.9443e-05
[2021-04-12 06:30:29] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-12 06:30:31] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-12 06:30:33] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-12 06:30:39] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-12 06:30:40] [valid] Ep. 2 : Up. 180000 : perplexity : 2.22167 : new best
[2021-04-12 08:36:55] Ep. 2 : Up. 190000 : Sen. 60,324,297 : Cost 0.32058367 * 1,540,880,487 @ 8,037 after 29,247,652,068 : Time 7585.61s : 25468.44 words/s : L.r. 8.7057e-05
[2021-04-12 08:36:55] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-12 08:36:57] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-12 08:36:59] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-12 08:37:05] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-12 08:37:06] [valid] Ep. 2 : Up. 190000 : perplexity : 2.2181 : new best
[2021-04-12 10:43:15] Ep. 2 : Up. 200000 : Sen. 68,291,176 : Cost 0.31953624 * 1,540,464,828 @ 14,844 after 30,788,116,896 : Time 7580.26s : 25442.26 words/s : L.r. 8.4853e-05
[2021-04-12 10:43:16] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-12 10:43:17] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-12 10:43:19] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-12 10:43:25] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-12 10:43:26] [valid] Ep. 2 : Up. 200000 : perplexity : 2.21232 : new best
[2021-04-12 12:49:55] Ep. 2 : Up. 210000 : Sen. 76,282,171 : Cost 0.31941819 * 1,543,300,054 @ 17,957 after 32,331,416,950 : Time 7599.36s : 25456.48 words/s : L.r. 8.2808e-05
[2021-04-12 12:49:55] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-12 12:49:57] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-12 12:50:00] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-12 12:50:06] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-12 12:50:07] [valid] Ep. 2 : Up. 210000 : perplexity : 2.20852 : new best
[2021-04-12 14:56:06] Ep. 2 : Up. 220000 : Sen. 84,244,657 : Cost 0.31943667 * 1,537,478,561 @ 18,972 after 33,868,895,511 : Time 7571.45s : 25486.67 words/s : L.r. 8.0904e-05
[2021-04-12 14:56:06] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-12 14:56:08] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-12 14:56:10] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-12 14:56:16] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-12 14:56:17] [valid] Ep. 2 : Up. 220000 : perplexity : 2.2026 : new best
[2021-04-12 16:41:55] Seen 90910822 samples
[2021-04-12 16:41:55] Starting data epoch 3 in logical epoch 3
[2021-04-12 16:41:55] [sqlite] Selecting shuffled data
[2021-04-12 17:04:47] Ep. 3 : Up. 230000 : Sen. 1,324,712 : Cost 0.31791928 * 1,546,400,571 @ 27,844 after 35,415,296,082 : Time 7720.20s : 25068.34 words/s : L.r. 7.9126e-05
[2021-04-12 17:04:47] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-12 17:04:48] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-12 17:04:50] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-12 17:04:56] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-12 17:04:57] [valid] Ep. 3 : Up. 230000 : perplexity : 2.19778 : new best
[2021-04-12 19:10:47] Ep. 3 : Up. 240000 : Sen. 9,250,409 : Cost 0.31746748 * 1,532,637,149 @ 31,358 after 36,947,933,231 : Time 7559.74s : 25399.67 words/s : L.r. 7.7460e-05
[2021-04-12 19:10:47] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-12 19:10:49] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-12 19:10:50] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-12 19:10:57] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-12 19:10:58] [valid] Ep. 3 : Up. 240000 : perplexity : 2.19337 : new best
[2021-04-12 21:16:59] Ep. 3 : Up. 250000 : Sen. 17,210,132 : Cost 0.31740338 * 1,537,883,637 @ 28,824 after 38,485,816,868 : Time 7572.71s : 25453.29 words/s : L.r. 7.5895e-05
[2021-04-12 21:16:59] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-12 21:17:01] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-12 21:17:03] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-12 21:17:09] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-12 21:17:10] [valid] Ep. 3 : Up. 250000 : perplexity : 2.19031 : new best
[2021-04-12 23:22:59] Ep. 3 : Up. 260000 : Sen. 25,147,658 : Cost 0.31638047 * 1,537,429,791 @ 9,960 after 40,023,246,659 : Time 7559.84s : 25428.46 words/s : L.r. 7.4421e-05
[2021-04-12 23:22:59] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-12 23:23:01] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-12 23:23:03] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-12 23:23:10] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-12 23:23:12] [valid] Ep. 3 : Up. 260000 : perplexity : 2.18651 : new best
[2021-04-13 01:29:56] Ep. 3 : Up. 270000 : Sen. 33,152,000 : Cost 0.31686807 * 1,546,803,123 @ 13,477 after 41,570,049,782 : Time 7617.21s : 25453.55 words/s : L.r. 7.3030e-05
[2021-04-13 01:29:57] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-13 01:29:58] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-13 01:30:01] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-13 01:30:08] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-13 01:30:09] [valid] Ep. 3 : Up. 270000 : perplexity : 2.18395 : new best
[2021-04-13 03:36:03] Ep. 3 : Up. 280000 : Sen. 41,096,689 : Cost 0.31580126 * 1,538,901,399 @ 24,387 after 43,108,951,181 : Time 7566.69s : 25428.48 words/s : L.r. 7.1714e-05
[2021-04-13 03:36:03] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-13 03:36:05] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-13 03:36:07] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-13 03:36:13] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-13 03:36:14] [valid] Ep. 3 : Up. 280000 : perplexity : 2.1804 : new best
[2021-04-13 05:42:22] Ep. 3 : Up. 290000 : Sen. 49,055,760 : Cost 0.31599799 * 1,539,976,265 @ 23,622 after 44,648,927,446 : Time 7579.18s : 25439.95 words/s : L.r. 7.0466e-05
[2021-04-13 05:42:22] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-13 05:42:24] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-13 05:42:26] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-13 05:42:32] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-13 05:42:33] [valid] Ep. 3 : Up. 290000 : perplexity : 2.17926 : new best
[2021-04-13 07:48:58] Ep. 3 : Up. 300000 : Sen. 57,026,608 : Cost 0.31556690 * 1,543,156,902 @ 14,832 after 46,192,084,348 : Time 7595.09s : 25429.63 words/s : L.r. 6.9282e-05
[2021-04-13 07:48:58] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-13 07:49:00] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-13 07:49:01] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-13 07:49:08] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-13 07:49:09] [valid] Ep. 3 : Up. 300000 : perplexity : 2.17514 : new best
[2021-04-13 09:55:31] Ep. 3 : Up. 310000 : Sen. 65,011,920 : Cost 0.31567365 * 1,543,413,751 @ 21,824 after 47,735,498,099 : Time 7593.40s : 25471.66 words/s : L.r. 6.8155e-05
[2021-04-13 09:55:31] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-13 09:55:33] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-13 09:55:35] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-13 09:55:42] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-13 09:55:42] [valid] Ep. 3 : Up. 310000 : perplexity : 2.1733 : new best
[2021-04-13 12:02:17] Ep. 3 : Up. 320000 : Sen. 73,002,709 : Cost 0.31527013 * 1,543,823,232 @ 15,555 after 49,279,321,331 : Time 7605.54s : 25414.73 words/s : L.r. 6.7082e-05
[2021-04-13 12:02:17] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-13 12:02:19] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-13 12:02:21] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-13 12:02:28] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-13 12:02:29] [valid] Ep. 3 : Up. 320000 : perplexity : 2.17204 : new best
[2021-04-13 14:08:32] Ep. 3 : Up. 330000 : Sen. 80,947,276 : Cost 0.31495792 * 1,537,202,843 @ 22,330 after 50,816,524,174 : Time 7575.18s : 25407.78 words/s : L.r. 6.6058e-05
[2021-04-13 14:08:32] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-13 14:08:34] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-13 14:08:36] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-13 14:08:42] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-13 14:08:43] [valid] Ep. 3 : Up. 330000 : perplexity : 2.17051 : new best
[2021-04-13 16:15:29] Ep. 3 : Up. 340000 : Sen. 88,952,868 : Cost 0.31548429 * 1,544,234,194 @ 20,283 after 52,360,758,368 : Time 7616.75s : 25446.45 words/s : L.r. 6.5079e-05
[2021-04-13 16:15:29] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-13 16:15:31] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-13 16:15:33] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-13 16:15:40] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-13 16:15:41] [valid] Ep. 3 : Up. 340000 : perplexity : 2.16648 : new best
[2021-04-13 16:46:46] Seen 90910822 samples
[2021-04-13 16:46:46] Starting data epoch 4 in logical epoch 4
[2021-04-13 16:46:46] [sqlite] Selecting shuffled data
[2021-04-13 18:23:07] Ep. 4 : Up. 350000 : Sen. 5,952,000 : Cost 0.31392208 * 1,531,321,712 @ 9,800 after 53,892,080,080 : Time 7657.78s : 25022.05 words/s : L.r. 6.4143e-05
[2021-04-13 18:23:07] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.orig.npz
[2021-04-13 18:23:09] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz
[2021-04-13 18:23:11] Saving Adam parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.optimizer.npz
[2021-04-13 18:23:18] Saving model weights and runtime parameters to /users/tiedeman/research/Opus-MT-train/work-tatoeba/eng-rus/opus+bt.spm32k-spm32k.transformer-align.model1.npz.best-perplexity.npz
[2021-04-13 18:23:19] [valid] Ep. 4 : Up. 350000 : perplexity : 2.16644 : new best