model_type for text-generation-webui?

#3
by feliscat - opened

This model only works when I set model_type to gptj in text-generation-webui. Does that make sense?

It's actually an OPT model. But I seem to recall that text-gen-ui doesn't have that option.

But GPTQ-for-LLaMA supports OPT models so I guess whatever text-gen-ui does when you choose GPTJ also works for OPT.
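If that's the case, an invocation along these lines should load it. The flag names below are the ones text-generation-webui used during its GPTQ-for-LLaMa integration; treat the model name and quantization settings as placeholders and check `python server.py --help` for your version:

```
# Hypothetical launch command; model path, wbits, and groupsize are
# placeholders, and the flags assume the GPTQ-for-LLaMa-era CLI.
python server.py --model your-opt-gptq-model \
    --wbits 4 --groupsize 128 \
    --model_type gptj
```

The odd part is passing `--model_type gptj` for an OPT model, but that's apparently the combination that works here.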

I never really tested or experimented with this model, so I don't have any experience with it myself.

But if it works, it works!

Got an error when running with the 'opt' model type:

```
File "oobabooga_linux/installer_files/env/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py", line 211, in forward
    attn_weights = torch.bmm(query_states, key_states.transpose(1, 2))
RuntimeError: expected scalar type Half but found Float
```

BTW, it works with the gptj model type,
but the generated text is quite unsatisfactory compared to the 13B model.
==> My bad. This model is specialized for scientific data; I was asking irrelevant questions.
