---
title: README
emoji: π
colorFrom: indigo
colorTo: indigo
sdk: static
pinned: false
---
# ZeroGPU Spaces
ZeroGPU is a new kind of hardware for Spaces.
It has two goals:
- Provide free GPU access for Spaces
- Allow Spaces to run on multiple GPUs
This is achieved by making Spaces efficiently hold and release GPUs as needed, as opposed to a classical GPU Space, which holds exactly one GPU at all times.
## Compatibility
ZeroGPU Spaces should mostly be compatible with any PyTorch-based GPU Space.
Compatibility with high-level HF libraries like `transformers` or `diffusers` is better guaranteed.
That said, ZeroGPU Spaces are not as broadly compatible as classical GPU Spaces, and you might still encounter unexpected bugs.
Also, for now, ZeroGPU Spaces only work with the Gradio SDK.
## Usage
To make your Space work with ZeroGPU, decorate the Python functions that actually require a GPU with `@spaces.GPU`.
While a decorated function is being invoked, the Space is allocated a GPU, and the GPU is released when the function completes.
Here is a practical example:
```python
import spaces
import gradio as gr
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(...)
pipe.to('cuda')

@spaces.GPU  # a GPU is attached for the duration of this function only
def generate(prompt):
    return pipe(prompt).images

gr.Interface(
    fn=generate,
    inputs=gr.Text(),
    outputs=gr.Gallery(),
).launch()
```
- We first `import spaces` (importing it first might prevent some issues, but it is not mandatory)
- Then we decorate the `generate` function by adding a `@spaces.GPU` line before its definition
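
To complement the example above and the compatibility note, here is a minimal hypothetical sketch using a plain PyTorch module instead of a diffusers pipeline. The `predict` function and the toy `Linear` model are illustrative assumptions, not part of the official example; the pattern (module-level `.to('cuda')`, GPU work inside a `@spaces.GPU`-decorated function) is the same:

```python
import spaces
import torch
import gradio as gr

# Toy PyTorch model (illustrative assumption). Moving it to CUDA at module
# level mirrors the diffusers example above; the GPU itself is only attached
# while a decorated function is running.
model = torch.nn.Linear(16, 4)
model.to('cuda')

@spaces.GPU
def predict(seed):
    # A GPU is held for the duration of this call and released afterwards
    torch.manual_seed(int(seed))
    x = torch.randn(1, 16, device='cuda')
    return str(model(x).tolist())

gr.Interface(fn=predict, inputs=gr.Number(value=0), outputs=gr.Text()).launch()
```

Outside of ZeroGPU this runs like any other Gradio app; on ZeroGPU, the GPU is attributed per call, as described above.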
## Early access
Feel free to join this organization if you want to try ZeroGPU as a Space author. We should accept you shortly after checking your HF profile.