cross link to meg-ds checkpoint
README.md CHANGED
@@ -2442,6 +2442,11 @@ Initial prompting experiments using interim checkpoints: https://huggingface.co/
 
 </details>
 
+
+## Original checkpoints
+
+The checkpoints in this repo correspond to the HuggingFace Transformers format. If you want to use our fork of [Megatron-DeepSpeed](https://github.com/bigscience-workshop/Megatron-DeepSpeed) that the model was trained with, you'd want to use [this repo instead](https://huggingface.co/bigscience/bloom-optimizer-states).
+
 ---
 
 # Model Card Authors
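The added paragraph distinguishes the Transformers-format checkpoints in this repo from the Megatron-DeepSpeed training checkpoints. A minimal sketch of loading the Transformers-format weights (the helper name `load_bloom` is illustrative, not part of the model card; the full `bigscience/bloom` download is very large, so this is a usage sketch rather than something to run casually):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer


def load_bloom(repo_id: str = "bigscience/bloom"):
    """Load a Transformers-format BLOOM checkpoint.

    Illustrative sketch: the checkpoints at bigscience/bloom-optimizer-states
    are NOT loadable this way -- they are Megatron-DeepSpeed training states
    intended for the bigscience-workshop/Megatron-DeepSpeed fork.
    """
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id)
    return tokenizer, model
```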