The model 'RWGPTQForCausalLM' is not supported for text-generation.

#18
by herMaster - opened

After loading the model and tokenizer, I set `text-generation` as the task in the pipeline. The loaded model's description is:

```
RWGPTQForCausalLM(
  (model): RWForCausalLM(
    (transformer): RWModel(
      (word_embeddings): Embedding(65024, 4544)
      (h): ModuleList(
        (0-31): 32 x DecoderLayer(
          (input_layernorm): LayerNorm((4544,), eps=1e-05, elementwise_affine=True)
          (self_attention): Attention(
            (maybe_rotary): RotaryEmbedding()
            (attention_dropout): Dropout(p=0.0, inplace=False)
            (dense): GeneralQuantLinear(in_features=4544, out_features=4544, bias=True)
            (query_key_value): GeneralQuantLinear(in_features=4544, out_features=4672, bias=True)
          )
          (mlp): MLP(
            (act): GELU(approximate='none')
            (dense_4h_to_h): GeneralQuantLinear(in_features=18176, out_features=4544, bias=True)
            (dense_h_to_4h): GeneralQuantLinear(in_features=4544, out_features=18176, bias=True)
          )
        )
      )
      (ln_f): LayerNorm((4544,), eps=1e-05, elementwise_affine=True)
    )
    (lm_head): Linear(in_features=4544, out_features=65024, bias=False)
  )
)
```

But when I set `text-generation` as the task in the pipeline, Colab raises the error mentioned in the title.
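One possible workaround (a sketch, not a confirmed fix): since the `pipeline` helper rejects model classes it does not recognize, you can bypass it and call `model.generate()` directly on the already-loaded GPTQ model. The helper below assumes `model` and `tokenizer` are the objects you already loaded (e.g. via AutoGPTQ's `AutoGPTQForCausalLM.from_quantized` and `AutoTokenizer.from_pretrained`):

```python
def generate_text(model, tokenizer, prompt, max_new_tokens=64):
    """Tokenize `prompt`, run generation directly on the model
    (skipping the pipeline's supported-architecture check), and
    decode the generated token ids back to text."""
    # Move the encoded inputs to the same device as the model.
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Usage would then be `print(generate_text(model, tokenizer, "Hello, my name is"))`, which sidesteps the pipeline entirely while using the exact model instance whose description is printed above.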
