stas committed on
Commit f291fe9
1 Parent(s): 26235c1
Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -58,7 +58,7 @@ Training a multilingual 176 billion parameters model in the open
 
  The training of BigScience’s main model started on **March 11, 2022 11:42am PST** and will continue for 3-4 months on 384 A100 80GB GPUs of the Jean Zay public supercomputer
 
- You can follow the training at [https://twitter.com/BigScienceLLM](https://twitter.com/BigScienceLLM)
+ You can follow the training at [https://twitter.com/BigScienceLLM](https://twitter.com/BigScienceLLM) or on [the Tensorboards tab above](https://huggingface.co/bigscience/tr11-176B-ml-logs/tensorboard#scalars&tagFilter=loss).
 
  ## More information on the model, dataset, hardware, environmental consideration:
 