cannot clone https://huggingface.co/TheBloke/Genz-70b-GPTQ, got killed

#1
by fabien-tarrade-axa-ch - opened

Hi there,

Congrats on the nice work. We wanted to use it, but we are unable to clone the repo. The issue seems to be the single 37 GB file model.safetensors. I have no issue cloning Llama 2 70B models (total size > 128 GB, but maximum file size is 10 GB).

git clone https://huggingface.co/TheBloke/Genz-70b-GPTQ
Cloning into 'Genz-70b-GPTQ'...
remote: Enumerating objects: 109, done.
remote: Counting objects: 100% (106/106), done.
remote: Compressing objects: 100% (106/106), done.
remote: Total 109 (delta 43), reused 0 (delta 0), pack-reused 3
Receiving objects: 100% (109/109), 503.54 KiB | 2.75 MiB/s, done.
Resolving deltas: 100% (43/43), done.
Downloading model.safetensors (37 GB)
[1] 3116 killed git clone https://huggingface.co/TheBloke/Genz-70b-GPTQ

Is this a known issue? Is there a trick to get it working? Is there a reason to ship one single file rather than a few shards?
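In case it helps others, a workaround we are considering (an assumption on our side, not something from the repo docs) is to skip git entirely and download the files with the huggingface_hub client, which streams each file to the local cache and can resume partial downloads, so no single process has to smudge the whole 37 GB file:

```python
from huggingface_hub import snapshot_download

REPO_ID = "TheBloke/Genz-70b-GPTQ"

def download_model(repo_id: str = REPO_ID) -> str:
    # Downloads every file in the repo to the local HF cache and
    # returns the local directory path. Resumable on interruption.
    return snapshot_download(repo_id=repo_id)

if __name__ == "__main__":
    print(download_model())
```

This sketch assumes huggingface_hub is installed (`pip install huggingface_hub`); the actual download still needs ~37 GB of disk space.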

Thanks

Same issue using the Python transformers library:

ValueError: Could not load model TheBloke/Genz-70b-GPTQ with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.llama.modeling_llama.LlamaForCausalLM'>).
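For context, a minimal version of how we try to load it is below. Our understanding (an assumption, not confirmed by the model card) is that loading a GPTQ checkpoint through transformers also requires the optimum / auto-gptq integration to be installed, otherwise the auto classes cannot construct the model:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "TheBloke/Genz-70b-GPTQ"

def load_model():
    # Assumes optimum and auto-gptq are installed alongside transformers,
    # so AutoModelForCausalLM can dequantize the GPTQ weights.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_model()
```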
