jeffra committed on
Commit 490e079
1 Parent(s): 5d7cc82

Deepspeed -> DeepSpeed

Files changed (1): README.md (+2 −2)
README.md CHANGED
@@ -2,8 +2,8 @@
 license: bigscience-bloom-rail-1.0
 ---
 
-This is a custom version of the original [BLOOM weights](https://huggingface.co/bigscience/bloom) to make it fast to use with the [Deepspeed-Inference](https://www.deepspeed.ai/tutorials/inference-tutorial/) engine which uses Tensor Parallelism. In this repo the tensors are split into 8 shards to target 8 GPUs.
+This is a custom version of the original [BLOOM weights](https://huggingface.co/bigscience/bloom) to make it fast to use with the [DeepSpeed-Inference](https://www.deepspeed.ai/tutorials/inference-tutorial/) engine which uses Tensor Parallelism. In this repo the tensors are split into 8 shards to target 8 GPUs.
 
 The full BLOOM documentation is [here](https://huggingface.co/bigscience/bloom)
 
-To use the weights in repo, you can adapt to your needs the scripts found [here](https://github.com/bigscience-workshop/Megatron-DeepSpeed/tree/main/scripts/inference) (XXX: they are going to migrate soon to HF Transformers code base, so will need to update the link once moved)
+To use the weights in repo, you can adapt to your needs the scripts found [here](https://github.com/bigscience-workshop/Megatron-DeepSpeed/tree/main/scripts/inference) (XXX: they are going to migrate soon to HF Transformers code base, so will need to update the link once moved)
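The README text above says the checkpoint tensors are split into 8 shards so that each of 8 GPUs holds one shard under tensor parallelism. A toy numpy sketch of what column-wise sharding of a weight matrix looks like (hypothetical small shapes, not DeepSpeed's actual checkpoint format or kernels):

```python
import numpy as np

# Toy illustration of tensor-parallel sharding: one weight matrix is split
# column-wise into 8 shards, conceptually one per GPU. The hidden size here
# is made up for the example; BLOOM's is far larger.
hidden = 64
num_shards = 8

rng = np.random.default_rng(0)
weight = rng.standard_normal((hidden, hidden)).astype(np.float32)
x = rng.standard_normal(hidden).astype(np.float32)

# Split along the output (column) dimension: shard i holds columns
# [i * hidden/8, (i+1) * hidden/8).
shards = np.split(weight, num_shards, axis=1)

# Each "GPU" computes only its slice of the output; concatenating the
# partial results in order recovers the full x @ weight product.
partials = [x @ shard for shard in shards]
full = np.concatenate(partials)

assert np.allclose(full, x @ weight, atol=1e-4)
```

In the actual engine, loading the pre-sharded checkpoint means each rank reads only its own shard instead of materializing the full model and splitting it at startup, which is what makes this repo "fast to use" with DeepSpeed-Inference.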