ValueError: not enough values to unpack (expected 3, got 2)

#2
by rhamnett - opened

Hello, I'm getting the following error when generating a response using the example in the Model card.
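For context, this ValueError means some call returned a 2-tuple where the caller tried to unpack three values. A minimal, self-contained reproduction of the same error pattern (the function name here is made up for illustration):

```python
# Hypothetical function standing in for an internal call that
# returns two values where the caller expects three.
def returns_two():
    return "hidden_states", "attention_weights"

try:
    a, b, c = returns_two()  # unpacking 3 names from a 2-tuple
except ValueError as e:
    print(e)  # prints: not enough values to unpack (expected 3, got 2)
```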

Fixed with inject_fused_attention=False πŸ‘πŸ‘

@rhamnett hello, I got the same error. Where did you inject this?

it goes in the .from_quantized() call
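A minimal sketch of the fix, assuming the AutoGPTQ library's `AutoGPTQForCausalLM.from_quantized()` API; the model directory is a placeholder, and the other arguments are typical values rather than anything from this thread:

```python
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

model_dir = "TheBloke/some-model-GPTQ"  # placeholder repo/path

tokenizer = AutoTokenizer.from_pretrained(model_dir, use_fast=True)
model = AutoGPTQForCausalLM.from_quantized(
    model_dir,
    use_safetensors=True,
    device="cuda:0",
    inject_fused_attention=False,  # disable fused attention to avoid the unpack error
)
```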

@TheBloke hello, I quantized a model but cannot load it with ctransformers:

ValueError: No model file found in directory 'out/quantized_openbuddy-coder-34b-v11-bf16'

but your model loads fine, why? I am using safetensors

ctransformers? ctransformers is for GGML/GGUF models, not GPTQ models. You want normal Hugging Face Transformers to load GPTQ models.
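A minimal sketch of loading a GPTQ model with Hugging Face Transformers, which handles GPTQ checkpoints when the optimum and auto-gptq packages are installed; the repo id is a placeholder:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/some-model-GPTQ"  # placeholder repo/path

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Transformers detects the GPTQ quantization config in the checkpoint
# and dispatches to the GPTQ kernels automatically.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
```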

but ctransformers also supports GPTQ via ExLlama
