NeMo · PyTorch · English · seq2seq · masked language modeling
MaximumEntropy committed
Commit d43eded · 1 parent: 002adce

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
````diff
@@ -56,7 +56,7 @@ git clone https://github.com/NVIDIA/NeMo.git
 cd NeMo/examples/nlp/language_modeling
 git checkout v1.11.0
 python megatron_t5_eval.py \
---model_file /raid/Data/NMT/Models/t5_3b/megatron_t5-tp2--val_los-1.09-step-999999-consumed-samples-2159846144.0.nemo \
+--model_file /raid/Data/NMT/Models/t5_3b/nemo_megatron_t5_3b_bf16_tp2.nemo \
 --prompt '<mask> was the first person to set foot on the moon. When he did, he uttered the phrase <mask> for man, one <mask> for mankind which is still a popular quote today.' \
 --tensor_model_parallel_size 2
 ```
````
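For context on the `--prompt` flag above: T5-family models fill in masked spans marked by numbered sentinel tokens (`<extra_id_0>`, `<extra_id_1>`, …), and the eval script's `<mask>` placeholders correspond to such spans. The helper below is a minimal illustrative sketch (the function name is hypothetical, not part of NeMo) showing how a prompt with generic `<mask>` markers maps onto T5-style numbered sentinels:

```python
import re

def to_t5_sentinels(prompt: str, mask_token: str = "<mask>") -> str:
    """Replace each occurrence of mask_token with a numbered
    T5-style sentinel token (<extra_id_0>, <extra_id_1>, ...)."""
    counter = 0

    def repl(_match):
        nonlocal counter
        tok = f"<extra_id_{counter}>"
        counter += 1
        return tok

    return re.sub(re.escape(mask_token), repl, prompt)

prompt = ("<mask> was the first person to set foot on the moon. "
          "When he did, he uttered the phrase <mask> for man, one <mask> "
          "for mankind which is still a popular quote today.")
print(to_t5_sentinels(prompt))
# Each <mask> becomes a distinct sentinel, so the model can emit a
# separate completion for every masked span.
```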