Apply for community grant: Personal project

#1 opened by Hazzzardous

Looking to run a 14B or 7B quantized RWKV model on a GPU instance. These should only require about 16 GB or 8 GB of VRAM, respectively.
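
For context, a minimal sketch of the kind of workload this grant targets: loading an RWKV checkpoint in 8-bit with transformers and bitsandbytes so a 14B model fits in roughly 16 GB of VRAM. The model id, quantization setting, and generation call below are illustrative assumptions, not the Space's actual code.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig

# Hypothetical checkpoint; swap in whichever RWKV model the Space serves.
model_id = "RWKV/rwkv-raven-14b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    # 8-bit weights roughly halve memory vs fp16, keeping 14B near the 16 GB budget.
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    device_map="auto",  # place layers on the available GPU automatically
)

prompt = "The RWKV architecture is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```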

Hi @Hazzzardous, we have assigned a GPU to this Space. Note that GPU grants are provided temporarily and might be removed after some time if the usage is very low.

To learn more about GPUs in Spaces, please check out https://huggingface.co/docs/hub/spaces-gpus
