add a link to https://huggingface.co/bigscience/bloom-intermediate/
README.md CHANGED
@@ -741,6 +741,8 @@ Initial prompting experiments using interim checkpoints: https://huggingface.co/
 
 The checkpoints in this repo correspond to the HuggingFace Transformers format. If you want to use our fork of [Megatron-DeepSpeed](https://github.com/bigscience-workshop/Megatron-DeepSpeed) that the model was trained with, you'd want to use [this repo instead](https://huggingface.co/bigscience/bloom-optimizer-states).
 
+Many intermediate checkpoints are available at https://huggingface.co/bigscience/bloom-intermediate/
+
 ---
 
 # Model Card Authors
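
Since the README states that the checkpoints in the main repo are in the HuggingFace Transformers format, a minimal loading sketch might look like the following. The repo id `bigscience/bloom` for the final model and the branch name passed to `revision` for an intermediate checkpoint are assumptions for illustration; the actual branch/step names should be checked on the bloom-intermediate repo page linked above.

```python
# Minimal sketch, assuming the final model lives at "bigscience/bloom" and that
# intermediate checkpoints are exposed as branches of "bigscience/bloom-intermediate".
from transformers import AutoModelForCausalLM, AutoTokenizer

# Final released checkpoint, in HuggingFace Transformers format.
tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom")
model = AutoModelForCausalLM.from_pretrained("bigscience/bloom")

# An intermediate training checkpoint; `revision` selects a branch in the Hub repo.
# "global_step95000" is a hypothetical branch name -- check the repo for real ones.
intermediate = AutoModelForCausalLM.from_pretrained(
    "bigscience/bloom-intermediate",
    revision="global_step95000",
)
```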