Stopped loading

#1 by StableDiffusion69 - opened

Since the last update of oobabooga, the model has stopped loading. It just stays at "Loading checkpoint shards: 0%".
Are any new settings needed to run it again (I'm currently using the Transformers loader)?
I would really like to use this model again with my 8GB card. It is wonderful.
Thanks!

Update:
In the meantime, I noticed that it didn't stop loading; it just loads incredibly slowly. Normally it takes about 1 to 2 minutes to load, but now it often takes about 30 minutes. And that is not after switching models, but right after starting oobabooga.
Any idea why it takes so long sometimes? πŸ€”
I really love this model, it's great.

2023-06-21 18:15:24 INFO:Loading Imablank_P1GM4L10N-7B-MERGED_WEIGHTS...
2023-06-21 18:15:24 WARNING:Using the following 4-bit params: {'load_in_4bit': True, 'bnb_4bit_compute_dtype': torch.float16, 'bnb_4bit_quant_type': 'nf4', 'bnb_4bit_use_double_quant': False}
2023-06-21 18:15:26 WARNING:The model weights are not tied. Please use the tie_weights method before using the infer_auto_device function.
Loading checkpoint shards: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 2/2 [29:41<00:00, 890.88s/it]
2023-06-21 18:45:09 INFO:Loaded the model in 1784.14 seconds.
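For reference, a minimal sketch of what the logged 4-bit parameters correspond to when loading the model directly with Transformers and bitsandbytes. The repo id is my guess based on the folder name in the log, and this is only an approximation of what oobabooga does internally, not its exact code:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# 4-bit settings matching the values printed in the log above
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=False,
)

# Assumed repo id / local path (oobabooga stores it as Imablank_P1GM4L10N-7B-MERGED_WEIGHTS)
model_id = "Imablank/P1GM4L10N-7B-MERGED_WEIGHTS"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # let accelerate place layers on the 8GB GPU and offload the rest
)
```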
