Token length 3908?

#1
by Yhyu13 - opened

While doing GPTQ quantization for this model, I get this error:

Token indices sequence length is longer than the specified maximum sequence length for this model (3908 > 2048). Running this sequence through the model will result in indexing errors

Why has the token length changed?

Open Access AI Collective org

The token length is definitely the default 2048. See https://huggingface.co/openaccess-ai-collective/jeopardy-bot/blob/main/config.json
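For context, that message is a tokenizer-side warning, not a change to the model: it fires when a single calibration sample tokenizes to more indices than the model's maximum sequence length. A minimal sketch of the check (the `MAX_LEN` constant and `check_sample` helper are illustrative, not part of transformers or GPTQ tooling):

```python
# Sketch, assuming calibration samples are tokenized with the model's tokenizer.
MAX_LEN = 2048  # max sequence length from the model's config.json

def check_sample(token_ids):
    # Mirrors the transformers warning: the sample is longer than the limit.
    if len(token_ids) > MAX_LEN:
        print(f"Token indices sequence length is longer than the specified "
              f"maximum sequence length for this model "
              f"({len(token_ids)} > {MAX_LEN})")
    # Truncate so indexing past the position-embedding table cannot happen.
    return token_ids[:MAX_LEN]

sample = list(range(3908))  # stand-in for a 3908-token calibration example
truncated = check_sample(sample)
```

Truncating (or chunking) long calibration samples to 2048 tokens before quantization silences the warning and avoids the indexing errors it predicts.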

winglian changed discussion status to closed
