Inference on TPU-v3-32

#69
opened by ybelkada (HF staff) · BigScience Workshop org · edited Jul 29, 2022

Hi @zhiG!

Let's move the discussion here!
For BLOOM TPU inference you can check this repo: https://github.com/huggingface/bloom-jax-inference, where we use Flax model partitioning.
original discussion: https://huggingface.co/bigscience/bloom/discussions/68
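For context, here is a minimal sketch of the kind of model partitioning this involves, written with the current `jax.sharding` API. It is not the actual code from bloom-jax-inference; the layer sizes, axis name, and function below are hypothetical and only illustrate sharding a single weight matrix across TPU devices.

```python
import jax
import jax.numpy as jnp
import numpy as np
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# Hypothetical dimensions; the real BLOOM weights are far larger.
hidden = 1024
ffn = 4096

# Build a 1-D device mesh over all available accelerators (TPU cores on a TPU host).
devices = np.array(jax.devices())
mesh = Mesh(devices, axis_names=("model",))

# Shard a weight matrix column-wise across the "model" axis, the same
# style of tensor partitioning used for large MLP layers.
w = jnp.ones((hidden, ffn), dtype=jnp.bfloat16)
w_sharded = jax.device_put(w, NamedSharding(mesh, P(None, "model")))

# jit compiles the computation against the sharded weights; XLA inserts
# the necessary collectives automatically.
@jax.jit
def mlp(x, w):
    return jnp.dot(x, w)

x = jnp.ones((8, hidden), dtype=jnp.bfloat16)
y = mlp(x, w_sharded)
print(y.shape)  # (8, 4096), output sharded over the last axis
```

In the repo itself, partitioning rules are applied across all of BLOOM's layers and the full checkpoint, not a single matrix as in this sketch.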

Thank you very much for the information, I'll try to run it on TPU!

It seems the JAX checkpoints of the 176B BLOOM model are not available yet, so we will have to wait.
