Update README.md
README.md CHANGED
@@ -737,13 +737,7 @@ Initial prompting experiments using interim checkpoints: https://huggingface.co/
 </details>
 
 
-## Original checkpoints
 
-The checkpoints in this repo correspond to the HuggingFace Transformers format. If you want to use our fork of [Megatron-DeepSpeed](https://github.com/bigscience-workshop/Megatron-DeepSpeed) that the model was trained with, you'd want to use [this repo instead](https://huggingface.co/bigscience/bloom-optimizer-states).
-
-Many intermediate checkpoints are available at https://huggingface.co/bigscience/bloom-intermediate/
-
----
 
 # Model Card Authors
 *Ordered roughly chronologically and by amount of time spent on creating this model card.*